Compare commits

103 Commits:

09c535f159, 7fb11da5df, 9c9a46499a, 6fca60261e, 00537b32ef, 8636758a90, e39dfbd624,
6e048b2a12, f9657599c2, 42ae3bba9b, 2fd56a4bfe, 824bcc5603, 4fbb613aaa, 9eb45270f2,
75c61c53e8, 2688a47436, fe3bf4b189, 456cb5ebb2, 3d91d574b4, 54876c5499, d256585284,
bd8f100b43, 44f05f2dcc, 43f7f82bdc, e902f63211, 129f68e194, 4b37fe12d7, 6de79922c5,
e1a9791f44, 81795f51c6, 68dfb11155, 39fc1beb89, fe0ddec0f9, 9b52b4efd9, e90e527603,
a510854741, 8935ce4ccf, f9edc9059a, db8917a769, c2d70cc1c2, 3b13c7f9ce, b7150d8026,
041830a7f8, a18daf0195, 5d3dfceb22, c82855e732, 956f156018, 9b13c35e7f, bc8e637bba,
f03c28c906, e4b1f39fdc, 4780af910c, d61ce5c524, 20ab151f4d, 8a7be7543a, 3f806aec9c,
6c273b32bb, b986f9d6ee, c98cca6b7b, fbec78ede5, c1d9a2d1f1, 8a10036f32, 924a3aec0e,
3b3ac31541, e0cb2f9d0f, 549b4edb59, 67c912aca2, a74dde5d9e, f7bcd24726, 337c900770,
e83e73ead4, 24f6f9b063, 5dc999360e, 9ec2f6b64d, f970592efe, 7592c11e99, 759b05e137,
42ebd9ffce, bc0fc33966, f4aab16e39, e91425287c, f05908f570, 8b351edf9c, 93c06eaba0,
a8d9fa75d4, 159ecd3e4f, 717803c665, 0d40589e8a, 8c5544bfad, 0c9be9f84f, 497729ecd6,
21a8efa3b8, c2f942a51e, 63b4b95240, 955f37e005, cd2ae89b0e, 0b013fa438, 478b657354,
65b6aabe69, 3fabae5b5f, 96c46a9e12, 381b93e8eb, f51e5b6fbf
.gitignore (vendored): 3 additions

@@ -42,3 +42,6 @@ api/tacticalrmm/accounts/management/commands/random_data.py
 versioninfo.go
 resource.syso
 htmlcov/
+docker-compose.dev.yml
+docs/.vuepress/dist
+nats-rmm.conf
.travis.yml: 43 deletions (file removed)

@@ -1,43 +0,0 @@
-dist: focal
-
-matrix:
-  include:
-    - language: node_js
-      node_js: "12"
-      before_install:
-        - cd web
-      install:
-        - npm install
-      script:
-        - npm run test:unit
-
-    - language: python
-      python: "3.8"
-      services:
-        - redis
-
-      addons:
-        postgresql: "13"
-        apt:
-          packages:
-            - postgresql-13
-
-      before_script:
-        - psql -c 'CREATE DATABASE travisci;' -U postgres
-        - psql -c "CREATE USER travisci WITH PASSWORD 'travisSuperSekret6645';" -U postgres
-        - psql -c 'GRANT ALL PRIVILEGES ON DATABASE travisci TO travisci;' -U postgres
-        - psql -c 'ALTER USER travisci CREATEDB;' -U postgres
-
-      before_install:
-        - cd api/tacticalrmm
-
-      install:
-        - pip install --no-cache-dir --upgrade pip
-        - pip install --no-cache-dir setuptools==49.6.0 wheel==0.35.1
-        - pip install --no-cache-dir -r requirements.txt -r requirements-test.txt
-
-      script:
-        - coverage run manage.py test -v 2
-
-      after_success:
-        - coveralls
README.md: 25 changes

@@ -1,6 +1,5 @@
 # Tactical RMM
 
-[](https://travis-ci.com/wh1te909/tacticalrmm)
 [](https://dev.azure.com/dcparsi/Tactical%20RMM/_build/latest?definitionId=4&branchName=develop)
 [](https://coveralls.io/github/wh1te909/tacticalrmm?branch=develop)
 [](https://opensource.org/licenses/MIT)

@@ -64,6 +63,7 @@ sudo ufw allow ssh
 sudo ufw allow http
 sudo ufw allow https
 sudo ufw allow proto tcp from any to any port 4505,4506
+sudo ufw allow proto tcp from any to any port 4222
 sudo ufw enable && sudo ufw reload
 ```
 
@@ -78,7 +78,7 @@ Create A record ```mesh.tacticalrmm.com``` for meshcentral
 Download the install script and run it
 
 ```
-wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/develop/install.sh
+wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/install.sh
 chmod +x install.sh
 ./install.sh
 ```

@@ -92,17 +92,17 @@ chmod +x install.sh
 From the app's dashboard, choose Agents > Install Agent to generate an installer.
 
 ## Updating
-Download and run [update.sh](./update.sh) ([Raw](https://raw.githubusercontent.com/wh1te909/tacticalrmm/develop/update.sh))
+Download and run [update.sh](https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/update.sh)
 ```
-wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/develop/update.sh
+wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/update.sh
 chmod +x update.sh
 ./update.sh
 ```
 
 ## Backup
-Download [backup.sh](./backup.sh) ([Raw](https://raw.githubusercontent.com/wh1te909/tacticalrmm/develop/backup.sh))
+Download [backup.sh](https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/backup.sh)
 ```
-wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/develop/backup.sh
+wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/backup.sh
 ```
 Change the postgres username and password at the top of the file (you can find them in `/rmm/api/tacticalrmm/tacticalrmm/local_settings.py` under the DATABASES section)
 
@@ -121,7 +121,7 @@ Copy backup file to new server
 
 Download the restore script, and edit the postgres username/password at the top of the file. Same instructions as above in the backup steps.
 ```
-wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/develop/restore.sh
+wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/restore.sh
 ```
 
 Run the restore script, passing it the backup tar file as the first argument

@@ -129,14 +129,3 @@ Run the restore script, passing it the backup tar file as the first argument
 chmod +x restore.sh
 ./restore.sh rmm-backup-xxxxxxx.tar
 ```
-
-## Using another ssl certificate
-During the install you can opt out of using the Let's Encrypt certificate. If you do this the script will create a self-signed certificate, so that https continues to work. You can replace the certificates in /certs/example.com/(privkey.pem | pubkey.pem) with your own.
-
-If you are migrating from Let's Encrypt to another certificate provider, you can create the /certs directory and copy your certificates there. It is recommended to do this because this directory will be backed up with the backup script provided. Then modify the nginx configurations to use your new certificates
-
-The cert that is generated is a wildcard certificate and is used in the nginx configurations: rmm.conf, api.conf, and mesh.conf. If you can't generate wildcard certificates you can create a cert for each subdomain and configure each nginx configuration file to use its own certificate. Then restart nginx:
-
-```
-sudo systemctl restart nginx
-```
api/tacticalrmm/accounts/migrations/0006_user_agent.py: 26 additions (new file)

@@ -0,0 +1,26 @@
+# Generated by Django 3.1.2 on 2020-11-10 20:24
+
+from django.db import migrations, models
+import django.db.models.deletion
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ("agents", "0024_auto_20201101_2319"),
+        ("accounts", "0005_auto_20201002_1303"),
+    ]
+
+    operations = [
+        migrations.AddField(
+            model_name="user",
+            name="agent",
+            field=models.OneToOneField(
+                blank=True,
+                null=True,
+                on_delete=django.db.models.deletion.CASCADE,
+                related_name="user",
+                to="agents.agent",
+            ),
+        ),
+    ]

@@ -0,0 +1,25 @@
+# Generated by Django 3.1.2 on 2020-11-01 22:54
+
+from django.db import migrations
+
+
+def link_agents_to_users(apps, schema_editor):
+    Agent = apps.get_model("agents", "Agent")
+    User = apps.get_model("accounts", "User")
+    for agent in Agent.objects.all():
+        user = User.objects.filter(username=agent.agent_id).first()
+
+        if user:
+            user.agent = agent
+            user.save()
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ("accounts", "0006_user_agent"),
+    ]
+
+    operations = [
+        migrations.RunPython(link_agents_to_users, migrations.RunPython.noop),
+    ]
api/tacticalrmm/accounts/migrations/0008_user_dark_mode.py: 18 additions (new file)

@@ -0,0 +1,18 @@
+# Generated by Django 3.1.3 on 2020-11-12 00:39
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('accounts', '0007_update_agent_primary_key'),
+    ]
+
+    operations = [
+        migrations.AddField(
+            model_name='user',
+            name='dark_mode',
+            field=models.BooleanField(default=True),
+        ),
+    ]
@@ -7,6 +7,15 @@ from logs.models import BaseAuditModel
 class User(AbstractUser, BaseAuditModel):
     is_active = models.BooleanField(default=True)
     totp_key = models.CharField(max_length=50, null=True, blank=True)
+    dark_mode = models.BooleanField(default=True)
+
+    agent = models.OneToOneField(
+        "agents.Agent",
+        related_name="user",
+        null=True,
+        blank=True,
+        on_delete=models.CASCADE,
+    )
 
     @staticmethod
     def serialize(user):
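The new one-to-one field means an agent's service account and the agent record can reach each other directly. A minimal sketch of traversing the relation from a Django shell, assuming an agent and its linked user already exist (the queryset at the end mirrors the GetAddUsers change further down):

```python
# Illustrative sketch of the new User <-> Agent one-to-one relation.
# Assumes a Django shell inside the project (python manage.py shell).
from accounts.models import User
from agents.models import Agent

agent = Agent.objects.first()
if agent is not None and hasattr(agent, "user"):
    print(agent.user.username)  # reverse access via related_name="user"

# Dashboard users are simply those with no linked agent, which is what
# the updated GetAddUsers view filters on:
dashboard_users = User.objects.filter(agent=None)
```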
@@ -195,6 +195,14 @@ class TestUserAction(TacticalTestCase):
 
         self.check_not_authenticated("put", url)
 
+    def test_darkmode(self):
+        url = "/accounts/users/ui/"
+        data = {"dark_mode": False}
+        r = self.client.patch(url, data, format="json")
+        self.assertEqual(r.status_code, 200)
+
+        self.check_not_authenticated("patch", url)
+
 
 class TestTOTPSetup(TacticalTestCase):
     def setUp(self):
@@ -7,4 +7,5 @@ urlpatterns = [
     path("users/reset/", views.UserActions.as_view()),
     path("users/reset_totp/", views.UserActions.as_view()),
     path("users/setup_totp/", views.TOTPSetup.as_view()),
+    path("users/ui/", views.UserUI.as_view()),
 ]
@@ -74,8 +74,7 @@ class LoginView(KnoxLoginView):
 
 class GetAddUsers(APIView):
     def get(self, request):
-        agents = Agent.objects.values_list("agent_id", flat=True)
-        users = User.objects.exclude(username__in=agents)
+        users = User.objects.filter(agent=None)
 
         return Response(UserSerializer(users, many=True).data)
 
@@ -157,3 +156,11 @@ class TOTPSetup(APIView):
             return Response(TOTPSetupSerializer(user).data)
 
         return Response("totp token already set")
+
+
+class UserUI(APIView):
+    def patch(self, request):
+        user = request.user
+        user.dark_mode = request.data["dark_mode"]
+        user.save(update_fields=["dark_mode"])
+        return Response("ok")
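The new UserUI view backs the dark-mode toggle exposed at `/accounts/users/ui/`. A minimal sketch of calling it from Python; the payload and path come from the test and url patterns above, while the host name and the knox-style token header are assumptions:

```python
# Sketch: flip dark mode for the logged-in user via the new endpoint.
# The host and token are placeholders, not values from this diff.
import requests

API = "https://api.example.com"
headers = {"Authorization": "Token <your-knox-token>"}

r = requests.patch(
    f"{API}/accounts/users/ui/", json={"dark_mode": False}, headers=headers
)
print(r.status_code)  # 200 with body "ok" on success
```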
@@ -26,6 +26,7 @@ def get_wmi_data():
 agent = Recipe(
     Agent,
     hostname="DESKTOP-TEST123",
+    version="1.1.0",
     monitoring_type=cycle(["workstation", "server"]),
     salt_id=generate_agent_id("DESKTOP-TEST123"),
     agent_id="71AHC-AA813-HH1BC-AAHH5-00013|DESKTOP-TEST123",
api/tacticalrmm/agents/migrations/0025_auto_20201122_0407.py: 18 additions (new file)

@@ -0,0 +1,18 @@
+# Generated by Django 3.1.3 on 2020-11-22 04:07
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('agents', '0024_auto_20201101_2319'),
+    ]
+
+    operations = [
+        migrations.AlterField(
+            model_name='recoveryaction',
+            name='mode',
+            field=models.CharField(choices=[('salt', 'Salt'), ('mesh', 'Mesh'), ('command', 'Command'), ('rpc', 'Nats RPC')], default='mesh', max_length=50),
+        ),
+    ]
api/tacticalrmm/agents/migrations/0026_auto_20201125_2334.py: 18 additions (new file)

@@ -0,0 +1,18 @@
+# Generated by Django 3.1.3 on 2020-11-25 23:34
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('agents', '0025_auto_20201122_0407'),
+    ]
+
+    operations = [
+        migrations.AlterField(
+            model_name='recoveryaction',
+            name='mode',
+            field=models.CharField(choices=[('salt', 'Salt'), ('mesh', 'Mesh'), ('command', 'Command'), ('rpc', 'Nats RPC'), ('checkrunner', 'Checkrunner')], default='mesh', max_length=50),
+        ),
+    ]
@@ -7,6 +7,7 @@ from Crypto.Random import get_random_bytes
 from Crypto.Hash import SHA3_384
 from Crypto.Util.Padding import pad
 import validators
+import msgpack
 import random
 import re
 import string

@@ -14,6 +15,8 @@ from collections import Counter
 from loguru import logger
 from packaging import version as pyver
 from distutils.version import LooseVersion
+from nats.aio.client import Client as NATS
+from nats.aio.errors import ErrTimeout
 
 from django.db import models
 from django.conf import settings

@@ -82,6 +85,10 @@
     def client(self):
         return self.site.client
 
+    @property
+    def has_nats(self):
+        return pyver.parse(self.version) >= pyver.parse("1.1.0")
+
     @property
     def timezone(self):
         # return the default timezone unless the timezone is explicity set per agent

@@ -142,11 +149,7 @@
 
     @property
     def has_patches_pending(self):
-
-        if self.winupdates.filter(action="approve").filter(installed=False).exists():
-            return True
-        else:
-            return False
+        return self.winupdates.filter(action="approve").filter(installed=False).exists()
 
     @property
     def checks(self):

@@ -433,6 +436,37 @@
         except Exception:
             return "err"
 
+    async def nats_cmd(self, data, timeout=30, wait=True):
+        nc = NATS()
+        options = {
+            "servers": f"tls://{settings.ALLOWED_HOSTS[0]}:4222",
+            "user": "tacticalrmm",
+            "password": settings.SECRET_KEY,
+            "connect_timeout": 3,
+            "max_reconnect_attempts": 2,
+        }
+        try:
+            await nc.connect(**options)
+        except:
+            return "natsdown"
+
+        if wait:
+            try:
+                msg = await nc.request(
+                    self.agent_id, msgpack.dumps(data), timeout=timeout
+                )
+            except ErrTimeout:
+                ret = "timeout"
+            else:
+                ret = msgpack.loads(msg.data)
+
+            await nc.close()
+            return ret
+        else:
+            await nc.publish(self.agent_id, msgpack.dumps(data))
+            await nc.flush()
+            await nc.close()
+
     def salt_api_cmd(self, **kwargs):
 
         # salt should always timeout first before the requests' timeout

@@ -592,10 +626,7 @@
         return "failed"
 
     def not_supported(self, version_added):
-        if pyver.parse(self.version) < pyver.parse(version_added):
-            return True
-
-        return False
+        return pyver.parse(self.version) < pyver.parse(version_added)
 
     def delete_superseded_updates(self):
         try:

@@ -721,6 +752,8 @@ RECOVERY_CHOICES = [
     ("salt", "Salt"),
     ("mesh", "Mesh"),
     ("command", "Command"),
+    ("rpc", "Nats RPC"),
+    ("checkrunner", "Checkrunner"),
 ]
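The new `nats_cmd` coroutine is what the synchronous views and tasks below drive through `asyncio.run`. A minimal sketch of that calling pattern, using the same payload shape as the ping view; the agent lookup value is illustrative:

```python
# Sketch: driving the async nats_cmd from synchronous Django code,
# mirroring how the updated views call it. pk=1 is illustrative.
import asyncio
from agents.models import Agent

agent = Agent.objects.get(pk=1)
if agent.has_nats:  # only agents on 1.1.0+ listen on NATS
    r = asyncio.run(agent.nats_cmd({"func": "ping"}, timeout=10))
    # r is "pong" on success, or "timeout" / "natsdown" on failure
    print(r)
```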
@@ -1,3 +1,4 @@
+import asyncio
 from loguru import logger
 from time import sleep
 import random

@@ -11,6 +12,7 @@ from django.conf import settings
 from tacticalrmm.celery import app
 from agents.models import Agent, AgentOutage
 from core.models import CoreSettings
+from logs.models import PendingAction
 
 logger.configure(**settings.LOG_CONFIG)

@@ -53,14 +55,32 @@ def send_agent_update_task(pks, version):
        logger.info(
            f"Updating {agent.salt_id} current version {agent.version} using {inno}"
        )
        r = agent.salt_api_async(
            func="win_agent.do_agent_update_v2",
            kwargs={
                "inno": inno,
                "url": url,
            },
        )
        logger.info(f"{agent.salt_id}: {r}")

        if agent.has_nats:
            if agent.pendingactions.filter(
                action_type="agentupdate", status="pending"
            ).exists():
                continue

            PendingAction.objects.create(
                agent=agent,
                action_type="agentupdate",
                details={
                    "url": agent.winagent_dl,
                    "version": settings.LATEST_AGENT_VER,
                    "inno": agent.win_inno_exe,
                },
            )
        # TODO
        # Salt is deprecated, remove this once salt is gone
        else:
            r = agent.salt_api_async(
                func="win_agent.do_agent_update_v2",
                kwargs={
                    "inno": inno,
                    "url": url,
                },
            )
            sleep(10)

@@ -107,14 +127,32 @@ def auto_self_agent_update_task():
        logger.info(
            f"Updating {agent.salt_id} current version {agent.version} using {inno}"
        )
        r = agent.salt_api_async(
            func="win_agent.do_agent_update_v2",
            kwargs={
                "inno": inno,
                "url": url,
            },
        )
        logger.info(f"{agent.salt_id}: {r}")

        if agent.has_nats:
            if agent.pendingactions.filter(
                action_type="agentupdate", status="pending"
            ).exists():
                continue

            PendingAction.objects.create(
                agent=agent,
                action_type="agentupdate",
                details={
                    "url": agent.winagent_dl,
                    "version": settings.LATEST_AGENT_VER,
                    "inno": agent.win_inno_exe,
                },
            )
        # TODO
        # Salt is deprecated, remove this once salt is gone
        else:
            r = agent.salt_api_async(
                func="win_agent.do_agent_update_v2",
                kwargs={
                    "inno": inno,
                    "url": url,
                },
            )
            sleep(10)

@@ -140,7 +178,11 @@ def update_salt_minion_task():
@app.task
def get_wmi_detail_task(pk):
    agent = Agent.objects.get(pk=pk)
    r = agent.salt_api_async(timeout=30, func="win_agent.local_sys_info")
    if agent.has_nats:
        asyncio.run(agent.nats_cmd({"func": "sysinfo"}, wait=False))
    else:
        agent.salt_api_async(timeout=30, func="win_agent.local_sys_info")

    return "ok"

@@ -160,7 +202,7 @@ def sync_salt_modules_task(pk):
def batch_sync_modules_task():
    # sync modules, split into chunks of 50 agents to not overload salt
    agents = Agent.objects.all()
    online = [i.salt_id for i in agents if i.status == "online"]
    online = [i.salt_id for i in agents]
    chunks = (online[i : i + 50] for i in range(0, len(online), 50))
    for chunk in chunks:
        Agent.salt_batch_async(minions=chunk, func="saltutil.sync_modules")

@@ -171,15 +213,19 @@
def batch_sysinfo_task():
    # update system info using WMI
    agents = Agent.objects.all()
    online = [
        i.salt_id
        for i in agents
        if not i.not_supported("0.11.0") and i.status == "online"

    agents_nats = [agent for agent in agents if agent.has_nats]
    minions = [
        agent.salt_id
        for agent in agents
        if not agent.has_nats and pyver.parse(agent.version) >= pyver.parse("0.11.0")
    ]
    chunks = (online[i : i + 30] for i in range(0, len(online), 30))
    for chunk in chunks:
        Agent.salt_batch_async(minions=chunk, func="win_agent.local_sys_info")
        sleep(10)

    if minions:
        Agent.salt_batch_async(minions=minions, func="win_agent.local_sys_info")

    for agent in agents_nats:
        asyncio.run(agent.nats_cmd({"func": "sysinfo"}, wait=False))


@app.task
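The batch tasks above split the agent list into fixed-size chunks so that a single Salt call never targets too many minions at once. A standalone sketch of the same generator pattern, with illustrative values:

```python
# Sketch of the chunking used by batch_sync_modules_task / batch_sysinfo_task;
# the salt_ids list is illustrative, not project data.
salt_ids = [f"agent-{n}" for n in range(115)]

chunk_size = 50  # the tasks above use 50 (module sync) or 30 (sysinfo)
chunks = (salt_ids[i : i + chunk_size] for i in range(0, len(salt_ids), chunk_size))

for chunk in chunks:
    # each chunk would be handed to Agent.salt_batch_async(minions=chunk, ...)
    print(len(chunk))  # 50, 50, 15
```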
@@ -7,9 +7,7 @@ from itertools import cycle

from django.conf import settings
from django.utils import timezone as djangotime
from rest_framework.authtoken.models import Token

from accounts.models import User
from tacticalrmm.test import TacticalTestCase
from .serializers import AgentSerializer
from winupdate.serializers import WinUpdatePolicySerializer

@@ -34,7 +32,9 @@ class TestAgentViews(TacticalTestCase):

        client = baker.make("clients.Client", name="Google")
        site = baker.make("clients.Site", client=client, name="LA Office")
        self.agent = baker.make_recipe("agents.online_agent", site=site)
        self.agent = baker.make_recipe(
            "agents.online_agent", site=site, version="1.1.0"
        )
        baker.make_recipe("winupdate.winupdate_policy", agent=self.agent)

    def test_get_patch_policy(self):

@@ -81,29 +81,29 @@ class TestAgentViews(TacticalTestCase):

        self.check_not_authenticated("post", url)

    @patch("agents.models.Agent.salt_api_cmd")
    def test_ping(self, mock_ret):
    @patch("agents.models.Agent.nats_cmd")
    def test_ping(self, nats_cmd):
        url = f"/agents/{self.agent.pk}/ping/"

        mock_ret.return_value = "timeout"
        nats_cmd.return_value = "timeout"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 200)
        ret = {"name": self.agent.hostname, "status": "offline"}
        self.assertEqual(r.json(), ret)

        mock_ret.return_value = "error"
        nats_cmd.return_value = "natsdown"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 200)
        ret = {"name": self.agent.hostname, "status": "offline"}
        self.assertEqual(r.json(), ret)

        mock_ret.return_value = True
        nats_cmd.return_value = "pong"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 200)
        ret = {"name": self.agent.hostname, "status": "online"}
        self.assertEqual(r.json(), ret)

        mock_ret.return_value = False
        nats_cmd.return_value = "asdasjdaksdasd"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 200)
        ret = {"name": self.agent.hostname, "status": "offline"}

@@ -111,39 +111,23 @@ class TestAgentViews(TacticalTestCase):

        self.check_not_authenticated("get", url)

    @patch("agents.models.Agent.nats_cmd")
    @patch("agents.tasks.uninstall_agent_task.delay")
    def test_uninstall(self, mock_task):
    @patch("agents.views.reload_nats")
    def test_uninstall(self, reload_nats, mock_task, nats_cmd):
        url = "/agents/uninstall/"
        data = {"pk": self.agent.pk}

        r = self.client.delete(url, data, format="json")
        self.assertEqual(r.status_code, 200)

        nats_cmd.assert_called_with({"func": "uninstall"}, wait=False)
        reload_nats.assert_called_once()
        mock_task.assert_called_with(self.agent.salt_id)

        self.check_not_authenticated("delete", url)

    @patch("agents.tasks.uninstall_agent_task.delay")
    def test_uninstall_catch_no_user(self, mock_task):
        # setup data
        agent_user = User.objects.create_user(
            username=self.agent.agent_id, password=User.objects.make_random_password(60)
        )
        agent_token = Token.objects.create(user=agent_user)

        url = "/agents/uninstall/"
        data = {"pk": self.agent.pk}

        agent_user.delete()

        r = self.client.delete(url, data, format="json")
        self.assertEqual(r.status_code, 200)

        mock_task.assert_called_with(self.agent.salt_id)

        self.check_not_authenticated("delete", url)

    @patch("agents.models.Agent.salt_api_cmd")
    @patch("agents.models.Agent.nats_cmd")
    def test_get_processes(self, mock_ret):
        url = f"/agents/{self.agent.pk}/getprocs/"

@@ -163,82 +147,61 @@ class TestAgentViews(TacticalTestCase):
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

        mock_ret.return_value = "error"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

        self.check_not_authenticated("get", url)

    @patch("agents.models.Agent.salt_api_cmd")
    def test_kill_proc(self, mock_ret):
    @patch("agents.models.Agent.nats_cmd")
    def test_kill_proc(self, nats_cmd):
        url = f"/agents/{self.agent.pk}/8234/killproc/"

        mock_ret.return_value = True
        nats_cmd.return_value = "ok"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 200)

        mock_ret.return_value = False
        nats_cmd.return_value = "timeout"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

        mock_ret.return_value = "timeout"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

        mock_ret.return_value = "error"
        nats_cmd.return_value = "process doesn't exist"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

        self.check_not_authenticated("get", url)

    @patch("agents.models.Agent.salt_api_cmd")
    @patch("agents.models.Agent.nats_cmd")
    def test_get_event_log(self, mock_ret):
        url = f"/agents/{self.agent.pk}/geteventlog/Application/30/"

        with open(
            os.path.join(settings.BASE_DIR, "tacticalrmm/test_data/eventlograw.json")
            os.path.join(settings.BASE_DIR, "tacticalrmm/test_data/appeventlog.json")
        ) as f:
            mock_ret.return_value = json.load(f)

        with open(
            os.path.join(settings.BASE_DIR, "tacticalrmm/test_data/appeventlog.json")
        ) as f:
            decoded = json.load(f)

        r = self.client.get(url)
        self.assertEqual(r.status_code, 200)
        self.assertEqual(decoded, r.json())

        mock_ret.return_value = "timeout"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

        mock_ret.return_value = "error"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

        self.check_not_authenticated("get", url)

    @patch("agents.models.Agent.salt_api_cmd")
    def test_power_action(self, mock_ret):
    @patch("agents.models.Agent.nats_cmd")
    def test_power_action(self, nats_cmd):
        url = f"/agents/poweraction/"

        data = {"pk": self.agent.pk, "action": "rebootnow"}
        mock_ret.return_value = True
        nats_cmd.return_value = "ok"
        r = self.client.post(url, data, format="json")
        self.assertEqual(r.status_code, 200)
        nats_cmd.assert_called_with({"func": "rebootnow"}, timeout=10)

        mock_ret.return_value = "error"
        r = self.client.post(url, data, format="json")
        self.assertEqual(r.status_code, 400)

        mock_ret.return_value = False
        nats_cmd.return_value = "timeout"
        r = self.client.post(url, data, format="json")
        self.assertEqual(r.status_code, 400)

        self.check_not_authenticated("post", url)

    @patch("agents.models.Agent.salt_api_cmd")
    @patch("agents.models.Agent.nats_cmd")
    def test_send_raw_cmd(self, mock_ret):
        url = f"/agents/sendrawcmd/"

@@ -257,10 +220,6 @@ class TestAgentViews(TacticalTestCase):
        r = self.client.post(url, data, format="json")
        self.assertEqual(r.status_code, 400)

        mock_ret.return_value = False
        r = self.client.post(url, data, format="json")
        self.assertEqual(r.status_code, 400)

        self.check_not_authenticated("post", url)

    @patch("agents.models.Agent.salt_api_cmd")

@@ -569,12 +528,14 @@ class TestAgentViews(TacticalTestCase):
        self.check_not_authenticated("get", url)

    @patch("winupdate.tasks.bulk_check_for_updates_task.delay")
    @patch("scripts.tasks.handle_bulk_script_task.delay")
    @patch("scripts.tasks.handle_bulk_command_task.delay")
    @patch("agents.models.Agent.salt_batch_async")
    def test_bulk_cmd_script(self, mock_ret, mock_update):
    def test_bulk_cmd_script(
        self, salt_batch_async, bulk_command, bulk_script, mock_update
    ):
        url = "/agents/bulk/"

        mock_ret.return_value = "ok"

        payload = {
            "mode": "command",
            "target": "agents",

@@ -589,6 +550,7 @@ class TestAgentViews(TacticalTestCase):
        }

        r = self.client.post(url, payload, format="json")
        bulk_command.assert_called_with([self.agent.pk], "gpupdate /force", "cmd", 300)
        self.assertEqual(r.status_code, 200)

        payload = {

@@ -620,6 +582,7 @@ class TestAgentViews(TacticalTestCase):

        r = self.client.post(url, payload, format="json")
        self.assertEqual(r.status_code, 200)
        bulk_command.assert_called_with([self.agent.pk], "gpupdate /force", "cmd", 300)

        payload = {
            "mode": "command",

@@ -636,12 +599,7 @@ class TestAgentViews(TacticalTestCase):

        r = self.client.post(url, payload, format="json")
        self.assertEqual(r.status_code, 200)

        mock_ret.return_value = "timeout"
        payload["client"] = self.agent.client.id
        payload["site"] = self.agent.site.id
        r = self.client.post(url, payload, format="json")
        self.assertEqual(r.status_code, 400)
        bulk_command.assert_called_with([self.agent.pk], "gpupdate /force", "cmd", 300)

        payload = {
            "mode": "scan",

@@ -652,9 +610,8 @@ class TestAgentViews(TacticalTestCase):
                self.agent.pk,
            ],
        }
        mock_ret.return_value = "ok"
        r = self.client.post(url, payload, format="json")
        mock_update.assert_called_once()
        mock_update.assert_called_with(minions=[self.agent.salt_id])
        self.assertEqual(r.status_code, 200)

        payload = {

@@ -666,6 +623,7 @@ class TestAgentViews(TacticalTestCase):
                self.agent.pk,
            ],
        }
        salt_batch_async.return_value = "ok"
        r = self.client.post(url, payload, format="json")
        self.assertEqual(r.status_code, 200)

@@ -681,41 +639,18 @@ class TestAgentViews(TacticalTestCase):

        self.check_not_authenticated("post", url)

    @patch("agents.models.Agent.salt_api_cmd")
    def test_restart_mesh(self, mock_ret):
        url = f"/agents/{self.agent.pk}/restartmesh/"

        mock_ret.return_value = "timeout"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

        mock_ret.return_value = "error"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

        mock_ret.return_value = False
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

        mock_ret.return_value = True
        r = self.client.get(url)
        self.assertEqual(r.status_code, 200)

        self.check_not_authenticated("get", url)

    @patch("agents.models.Agent.salt_api_cmd")
    def test_recover_mesh(self, mock_ret):
    @patch("agents.models.Agent.nats_cmd")
    def test_recover_mesh(self, nats_cmd):
        url = f"/agents/{self.agent.pk}/recovermesh/"
        mock_ret.return_value = True
        nats_cmd.return_value = "ok"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 200)
        self.assertIn(self.agent.hostname, r.data)
        nats_cmd.assert_called_with(
            {"func": "recover", "payload": {"mode": "mesh"}}, timeout=45
        )

        mock_ret.return_value = "timeout"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

        mock_ret.return_value = "error"
        nats_cmd.return_value = "timeout"
        r = self.client.get(url)
        self.assertEqual(r.status_code, 400)

@@ -804,13 +739,19 @@ class TestAgentTasks(TacticalTestCase):
        self.authenticate()
        self.setup_coresettings()

    @patch("agents.models.Agent.nats_cmd")
    @patch("agents.models.Agent.salt_api_async", return_value=None)
    def test_get_wmi_detail_task(self, salt_api_async):
        self.agent = baker.make_recipe("agents.agent")
        ret = get_wmi_detail_task.s(self.agent.pk).apply()
    def test_get_wmi_detail_task(self, salt_api_async, nats_cmd):
        self.agent_salt = baker.make_recipe("agents.agent", version="1.0.2")
        ret = get_wmi_detail_task.s(self.agent_salt.pk).apply()
        salt_api_async.assert_called_with(timeout=30, func="win_agent.local_sys_info")
        self.assertEqual(ret.status, "SUCCESS")

        self.agent_nats = baker.make_recipe("agents.agent", version="1.1.0")
        ret = get_wmi_detail_task.s(self.agent_nats.pk).apply()
        nats_cmd.assert_called_with({"func": "sysinfo"}, wait=False)
        self.assertEqual(ret.status, "SUCCESS")

    @patch("agents.models.Agent.salt_api_cmd")
    def test_sync_salt_modules_task(self, salt_api_cmd):
        self.agent = baker.make_recipe("agents.agent")

@@ -833,7 +774,7 @@ class TestAgentTasks(TacticalTestCase):
    @patch("agents.models.Agent.salt_batch_async", return_value=None)
    @patch("agents.tasks.sleep", return_value=None)
    def test_batch_sync_modules_task(self, mock_sleep, salt_batch_async):
        # chunks of 50, 60 online should run only 2 times
        # chunks of 50, should run 4 times
        baker.make_recipe(
            "agents.online_agent", last_seen=djangotime.now(), _quantity=60
        )

@@ -843,32 +784,41 @@ class TestAgentTasks(TacticalTestCase):
            _quantity=115,
        )
        ret = batch_sync_modules_task.s().apply()
        self.assertEqual(salt_batch_async.call_count, 2)
        self.assertEqual(salt_batch_async.call_count, 4)
        self.assertEqual(ret.status, "SUCCESS")

    @patch("agents.models.Agent.nats_cmd")
    @patch("agents.models.Agent.salt_batch_async", return_value=None)
    @patch("agents.tasks.sleep", return_value=None)
    def test_batch_sysinfo_task(self, mock_sleep, salt_batch_async):
        # chunks of 30, 70 online should run only 3 times
        self.online = baker.make_recipe(
            "agents.online_agent", version=settings.LATEST_AGENT_VER, _quantity=70
        )
        self.overdue = baker.make_recipe(
            "agents.overdue_agent", version=settings.LATEST_AGENT_VER, _quantity=115
    def test_batch_sysinfo_task(self, mock_sleep, salt_batch_async, nats_cmd):

        self.agents_nats = baker.make_recipe(
            "agents.agent", version="1.1.0", _quantity=20
        )
        # test nats
        ret = batch_sysinfo_task.s().apply()
        self.assertEqual(salt_batch_async.call_count, 3)
        self.assertEqual(nats_cmd.call_count, 20)
        nats_cmd.assert_called_with({"func": "sysinfo"}, wait=False)
        self.assertEqual(ret.status, "SUCCESS")

        self.agents_salt = baker.make_recipe(
            "agents.agent", version="1.0.2", _quantity=70
        )

        minions = [i.salt_id for i in self.agents_salt]

        ret = batch_sysinfo_task.s().apply()
        self.assertEqual(salt_batch_async.call_count, 1)
        salt_batch_async.assert_called_with(
            minions=minions, func="win_agent.local_sys_info"
        )
        self.assertEqual(ret.status, "SUCCESS")
        salt_batch_async.reset_mock()
        [i.delete() for i in self.online]
        [i.delete() for i in self.overdue]
        [i.delete() for i in self.agents_salt]

        # test old agents, should not run
        self.online_old = baker.make_recipe(
            "agents.online_agent", version="0.10.2", _quantity=70
        )
        self.overdue_old = baker.make_recipe(
            "agents.overdue_agent", version="0.10.2", _quantity=115
        self.agents_old = baker.make_recipe(
            "agents.agent", version="0.10.2", _quantity=70
        )
        ret = batch_sysinfo_task.s().apply()
        salt_batch_async.assert_not_called()
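The rewritten tests stack several @patch decorators, and the argument lists (e.g. `test_uninstall(self, reload_nats, mock_task, nats_cmd)`) only make sense if you remember how decorator order maps to argument order. A small reminder sketch of that standard unittest.mock behavior, not project code:

```python
# Sketch: stacked @patch decorators inject mocks bottom-up, so the
# innermost (lowest) decorator becomes the first mock argument.
from unittest.mock import patch

@patch("agents.models.Agent.nats_cmd")              # outermost -> last argument
@patch("agents.tasks.uninstall_agent_task.delay")
@patch("agents.views.reload_nats")                  # innermost -> first argument
def demo(reload_nats, mock_task, nats_cmd):
    nats_cmd.return_value = "ok"
    return reload_nats, mock_task, nats_cmd
```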
@@ -25,7 +25,6 @@ urlpatterns = [
     path("<int:pk>/ping/", views.ping),
     path("recover/", views.recover),
     path("runscript/", views.run_script),
-    path("<int:pk>/restartmesh/", views.restart_mesh),
     path("<int:pk>/recovermesh/", views.recover_mesh),
     path("<int:pk>/notes/", views.GetAddNotes.as_view()),
     path("<int:pk>/note/", views.GetEditDeleteNote.as_view()),
@@ -1,9 +1,7 @@
import asyncio
from loguru import logger
import os
import subprocess
import zlib
import json
import base64
import pytz
import datetime as dt
from packaging import version as pyver

@@ -18,9 +16,6 @@
from rest_framework import status, generics

from .models import Agent, AgentOutage, RecoveryAction, Note
from winupdate.models import WinUpdatePolicy
from clients.models import Client, Site
from accounts.models import User
from core.models import CoreSettings
from scripts.models import Script
from logs.models import AuditLog

@@ -37,9 +32,9 @@ from winupdate.serializers import WinUpdatePolicySerializer

from .tasks import uninstall_agent_task, send_agent_update_task
from winupdate.tasks import bulk_check_for_updates_task
from scripts.tasks import run_script_bg_task, run_bulk_script_task
from scripts.tasks import handle_bulk_command_task, handle_bulk_script_task

from tacticalrmm.utils import notify_error
from tacticalrmm.utils import notify_error, reload_nats

logger.configure(**settings.LOG_CONFIG)

@@ -66,31 +61,30 @@ def update_agents(request):
@api_view()
def ping(request, pk):
    agent = get_object_or_404(Agent, pk=pk)
    r = agent.salt_api_cmd(timeout=5, func="test.ping")
    if not agent.has_nats:
        return notify_error("Requires agent version 1.1.0 or greater")
    r = asyncio.run(agent.nats_cmd({"func": "ping"}, timeout=10))

    if r == "timeout" or r == "error":
    if r == "timeout" or r == "natsdown":
        return Response({"name": agent.hostname, "status": "offline"})

    if isinstance(r, bool) and r:
    elif r == "pong":
        return Response({"name": agent.hostname, "status": "online"})
    else:
        return Response({"name": agent.hostname, "status": "offline"})

    return Response({"name": agent.hostname, "status": "offline"})


@api_view(["DELETE"])
def uninstall(request):
    agent = get_object_or_404(Agent, pk=request.data["pk"])
    # just in case agent-user gets deleted accidentaly from django-admin
    # we can still remove the agent
    try:
        user = User.objects.get(username=agent.agent_id)
        user.delete()
    except Exception as e:
        logger.warning(e)
    if not agent.has_nats:
        return notify_error("Requires agent version 1.1.0 or greater")

    asyncio.run(agent.nats_cmd({"func": "uninstall"}, wait=False))

    salt_id = agent.salt_id
    name = agent.hostname
    agent.delete()
    reload_nats()

    uninstall_agent_task.delay(salt_id)
    return Response(f"{name} will now be uninstalled.")

@@ -160,12 +154,11 @@ def agent_detail(request, pk):
@api_view()
def get_processes(request, pk):
    agent = get_object_or_404(Agent, pk=pk)
    r = agent.salt_api_cmd(timeout=20, func="win_agent.get_procs")

    if not agent.has_nats:
        return notify_error("Requires agent version 1.1.0 or greater")
    r = asyncio.run(agent.nats_cmd(data={"func": "procs"}, timeout=5))
    if r == "timeout":
        return notify_error("Unable to contact the agent")
    elif r == "error":
        return notify_error("Something went wrong")

    return Response(r)

@@ -173,15 +166,17 @@ def get_processes(request, pk):
@api_view()
def kill_proc(request, pk, pid):
    agent = get_object_or_404(Agent, pk=pk)
    r = agent.salt_api_cmd(timeout=25, func="ps.kill_pid", arg=int(pid))
    if not agent.has_nats:
        return notify_error("Requires agent version 1.1.0 or greater")

    r = asyncio.run(
        agent.nats_cmd({"func": "killproc", "procpid": int(pid)}, timeout=15)
    )

    if r == "timeout":
        return notify_error("Unable to contact the agent")
    elif r == "error":
        return notify_error("Something went wrong")

    if isinstance(r, bool) and not r:
        return notify_error("Unable to kill the process")
    elif r != "ok":
        return notify_error(r)

    return Response("ok")

@@ -189,33 +184,32 @@ def kill_proc(request, pk, pid):
@api_view()
def get_event_log(request, pk, logtype, days):
    agent = get_object_or_404(Agent, pk=pk)
    r = agent.salt_api_cmd(
        timeout=30,
        func="win_agent.get_eventlog",
        arg=[logtype, int(days)],
    )

    if r == "timeout" or r == "error":
    if not agent.has_nats:
        return notify_error("Requires agent version 1.1.0 or greater")
    data = {
        "func": "eventlog",
        "timeout": 30,
        "payload": {
            "logname": logtype,
            "days": str(days),
        },
    }
    r = asyncio.run(agent.nats_cmd(data, timeout=32))
    if r == "timeout":
        return notify_error("Unable to contact the agent")

    return Response(json.loads(zlib.decompress(base64.b64decode(r["wineventlog"]))))
    return Response(r)


@api_view(["POST"])
def power_action(request):
    pk = request.data["pk"]
    action = request.data["action"]
    agent = get_object_or_404(Agent, pk=pk)
    if action == "rebootnow":
        logger.info(f"{agent.hostname} was scheduled for immediate reboot")
        r = agent.salt_api_cmd(
            timeout=30,
            func="system.reboot",
            arg=3,
            kwargs={"in_seconds": True},
        )
        if r == "timeout" or r == "error" or (isinstance(r, bool) and not r):
            return notify_error("Unable to contact the agent")
    agent = get_object_or_404(Agent, pk=request.data["pk"])
    if not agent.has_nats:
        return notify_error("Requires agent version 1.1.0 or greater")
    if request.data["action"] == "rebootnow":
        r = asyncio.run(agent.nats_cmd({"func": "rebootnow"}, timeout=10))
        if r != "ok":
            return notify_error("Unable to contact the agent")

    return Response("ok")
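The old get_event_log path returned the agent payload compressed: the decode in the view above implies the agent sent zlib-compressed, base64-encoded JSON under the "wineventlog" key. A standalone sketch of that round trip; the agent-side encoding here is inferred from the server-side decode, and the payload is illustrative:

```python
# Sketch of the encoding the old get_event_log response used:
# json -> zlib -> base64 on the agent side, reversed on the server.
import base64, json, zlib

events = [{"eventType": "INFO", "source": "Application"}]  # illustrative payload

encoded = base64.b64encode(zlib.compress(json.dumps(events).encode()))
wire = {"wineventlog": encoded}

decoded = json.loads(zlib.decompress(base64.b64decode(wire["wineventlog"])))
assert decoded == events
```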
@@ -223,21 +217,21 @@ def power_action(request):
@api_view(["POST"])
def send_raw_cmd(request):
    agent = get_object_or_404(Agent, pk=request.data["pk"])

    r = agent.salt_api_cmd(
        timeout=request.data["timeout"],
        func="cmd.run",
        kwargs={
            "cmd": request.data["cmd"],
    if not agent.has_nats:
        return notify_error("Requires agent version 1.1.0 or greater")
    timeout = int(request.data["timeout"])
    data = {
        "func": "rawcmd",
        "timeout": timeout,
        "payload": {
            "command": request.data["cmd"],
            "shell": request.data["shell"],
            "timeout": request.data["timeout"],
        },
    )
    }
    r = asyncio.run(agent.nats_cmd(data, timeout=timeout + 2))

    if r == "timeout":
        return notify_error("Unable to contact the agent")
    elif r == "error" or not r:
        return notify_error("Something went wrong")

    AuditLog.audit_raw_command(
        username=request.user.username,

@@ -246,7 +240,6 @@ def send_raw_cmd(request):
        shell=request.data["shell"],
    )

    logger.info(f"The command {request.data['cmd']} was sent on agent {agent.hostname}")
    return Response(r)


@@ -643,35 +636,60 @@ def install_agent(request):
@api_view(["POST"])
def recover(request):
    agent = get_object_or_404(Agent, pk=request.data["pk"])
    mode = request.data["mode"]

    if pyver.parse(agent.version) <= pyver.parse("0.9.5"):
        return notify_error("Only available in agent version greater than 0.9.5")

    if not agent.has_nats:
        if mode == "tacagent" or mode == "checkrunner" or mode == "rpc":
            return notify_error("Requires agent version 1.1.0 or greater")

    # attempt a realtime recovery if supported, otherwise fall back to old recovery method
    if agent.has_nats:
        if (
            mode == "tacagent"
            or mode == "checkrunner"
            or mode == "salt"
            or mode == "mesh"
        ):
            data = {"func": "recover", "payload": {"mode": mode}}
            r = asyncio.run(agent.nats_cmd(data, timeout=10))
            if r == "ok":
                return Response("Successfully completed recovery")

    if agent.recoveryactions.filter(last_run=None).exists():
        return notify_error(
            "A recovery action is currently pending. Please wait for the next agent check-in."
        )

    if request.data["mode"] == "command" and not request.data["cmd"]:
    if mode == "command" and not request.data["cmd"]:
        return notify_error("Command is required")

    # if we've made it this far and realtime recovery didn't work,
    # tacagent service is the fallback recovery so we obv can't use that to recover itself if it's down
    if mode == "tacagent":
        return notify_error(
            "Requires RPC service to be functional. Please recover that first"
        )

    # we should only get here if all other methods fail
    RecoveryAction(
        agent=agent,
        mode=request.data["mode"],
        command=request.data["cmd"] if request.data["mode"] == "command" else None,
        mode=mode,
        command=request.data["cmd"] if mode == "command" else None,
    ).save()

    return Response(f"Recovery will be attempted on the agent's next check-in")
    return Response("Recovery will be attempted on the agent's next check-in")


@api_view(["POST"])
def run_script(request):
    agent = get_object_or_404(Agent, pk=request.data["pk"])
    if not agent.has_nats:
        return notify_error("Requires agent version 1.1.0 or greater")
    script = get_object_or_404(Script, pk=request.data["scriptPK"])

    output = request.data["output"]
    args = request.data["args"]

    req_timeout = int(request.data["timeout"]) + 3

    AuditLog.audit_script_run(

@@ -680,75 +698,33 @@ def run_script(request):
        script=script.name,
    )

    data = {
        "func": "runscript",
        "timeout": request.data["timeout"],
        "script_args": request.data["args"],
        "payload": {
            "code": script.code,
            "shell": script.shell,
        },
    }

    if output == "wait":
        r = agent.salt_api_cmd(
            timeout=req_timeout,
            func="win_agent.run_script",
            kwargs={
                "filepath": script.filepath,
                "filename": script.filename,
                "shell": script.shell,
                "timeout": request.data["timeout"],
                "args": args,
            },
        )

        if isinstance(r, dict):
            if r["stdout"]:
                return Response(r["stdout"])
            elif r["stderr"]:
                return Response(r["stderr"])
            else:
                try:
                    r["retcode"]
                except KeyError:
                    return notify_error("Something went wrong")

                return Response(f"Return code: {r['retcode']}")

        else:
            if r == "timeout":
                return notify_error("Unable to contact the agent")
            elif r == "error":
                return notify_error("Something went wrong")
            else:
                return notify_error(str(r))

        r = asyncio.run(agent.nats_cmd(data, timeout=req_timeout))
        return Response(r)
    else:
        data = {
            "agentpk": agent.pk,
            "scriptpk": script.pk,
            "timeout": request.data["timeout"],
            "args": args,
        }
        run_script_bg_task.delay(data)
        asyncio.run(agent.nats_cmd(data, wait=False))
        return Response(f"{script.name} will now be run on {agent.hostname}")


@api_view()
def restart_mesh(request, pk):
    agent = get_object_or_404(Agent, pk=pk)
    r = agent.salt_api_cmd(func="service.restart", arg="mesh agent", timeout=30)
    if r == "timeout" or r == "error":
        return notify_error("Unable to contact the agent")
    elif isinstance(r, bool) and r:
        return Response(f"Restarted Mesh Agent on {agent.hostname}")
    else:
        return notify_error(f"Failed to restart the Mesh Agent on {agent.hostname}")


@api_view()
def recover_mesh(request, pk):
    agent = get_object_or_404(Agent, pk=pk)
    r = agent.salt_api_cmd(
        timeout=60,
        func="cmd.run",
        kwargs={
            "cmd": r'"C:\\Program Files\\TacticalAgent\\tacticalrmm.exe" -m recovermesh',
            "timeout": 55,
        },
    )
    if r == "timeout" or r == "error":
    if not agent.has_nats:
        return notify_error("Requires agent version 1.1.0 or greater")

    data = {"func": "recover", "payload": {"mode": "mesh"}}
    r = asyncio.run(agent.nats_cmd(data, timeout=45))
    if r != "ok":
        return notify_error("Unable to contact the agent")

    return Response(f"Repaired mesh agent on {agent.hostname}")

@@ -812,71 +788,44 @@ def bulk(request):
        return notify_error("Must select at least 1 agent")

    if request.data["target"] == "client":
        agents = Agent.objects.filter(site__client_id=request.data["client"])
        q = Agent.objects.filter(site__client_id=request.data["client"])
    elif request.data["target"] == "site":
        agents = Agent.objects.filter(site_id=request.data["site"])
        q = Agent.objects.filter(site_id=request.data["site"])
    elif request.data["target"] == "agents":
        agents = Agent.objects.filter(pk__in=request.data["agentPKs"])
        q = Agent.objects.filter(pk__in=request.data["agentPKs"])
    elif request.data["target"] == "all":
        agents = Agent.objects.all()
        q = Agent.objects.all()
    else:
        return notify_error("Something went wrong")

    minions = [agent.salt_id for agent in agents]
    minions = [agent.salt_id for agent in q]
    agents = [agent.pk for agent in q]

    AuditLog.audit_bulk_action(request.user, request.data["mode"], request.data)

    if request.data["mode"] == "command":
        r = Agent.salt_batch_async(
            minions=minions,
            func="cmd.run_bg",
            kwargs={
                "cmd": request.data["cmd"],
                "shell": request.data["shell"],
                "timeout": request.data["timeout"],
            },
        handle_bulk_command_task.delay(
            agents, request.data["cmd"], request.data["shell"], request.data["timeout"]
        )
        if r == "timeout":
            return notify_error("Salt API not running")
        return Response(f"Command will now be run on {len(minions)} agents")
        return Response(f"Command will now be run on {len(agents)} agents")

    elif request.data["mode"] == "script":
        script = get_object_or_404(Script, pk=request.data["scriptPK"])

        if script.shell == "python":
            r = Agent.salt_batch_async(
                minions=minions,
                func="win_agent.run_script",
                kwargs={
                    "filepath": script.filepath,
                    "filename": script.filename,
                    "shell": script.shell,
                    "timeout": request.data["timeout"],
                    "args": request.data["args"],
                    "bg": True,
                },
            )
            if r == "timeout":
                return notify_error("Salt API not running")
        else:
            data = {
                "minions": minions,
                "scriptpk": script.pk,
                "timeout": request.data["timeout"],
                "args": request.data["args"],
            }
            run_bulk_script_task.delay(data)

        return Response(f"{script.name} will now be run on {len(minions)} agents")
        handle_bulk_script_task.delay(
            script.pk, agents, request.data["args"], request.data["timeout"]
        )
        return Response(f"{script.name} will now be run on {len(agents)} agents")

    elif request.data["mode"] == "install":
        r = Agent.salt_batch_async(minions=minions, func="win_agent.install_updates")
        if r == "timeout":
            return notify_error("Salt API not running")
        return Response(
            f"Pending updates will now be installed on {len(minions)} agents"
            f"Pending updates will now be installed on {len(agents)} agents"
        )
    elif request.data["mode"] == "scan":
        bulk_check_for_updates_task.delay(minions=minions)
        return Response(f"Patch status scan will now run on {len(minions)} agents")
        return Response(f"Patch status scan will now run on {len(agents)} agents")

    return notify_error("Something went wrong")
@@ -1,3 +1,4 @@
import asyncio
import os
import requests
from loguru import logger

@@ -6,6 +7,7 @@ from django.conf import settings
from django.shortcuts import get_object_or_404
from django.utils import timezone as djangotime
from django.http import HttpResponse
from rest_framework import serializers

from rest_framework.response import Response
from rest_framework.views import APIView

@@ -32,7 +34,7 @@ from agents.tasks import (
from winupdate.tasks import check_for_updates_task
from software.tasks import get_installed_software, install_chocolatey
from checks.utils import bytes2human
from tacticalrmm.utils import notify_error
from tacticalrmm.utils import notify_error, reload_nats

logger.configure(**settings.LOG_CONFIG)

@@ -96,6 +98,17 @@ class Hello(APIView):
            recovery.save(update_fields=["last_run"])
            return Response(recovery.send())

        # handle agent update
        if agent.pendingactions.filter(
            action_type="agentupdate", status="pending"
        ).exists():
            update = agent.pendingactions.filter(
                action_type="agentupdate", status="pending"
            ).last()
            update.status = "completed"
            update.save(update_fields=["status"])
            return Response(update.details)

        # get any pending actions
        if agent.pendingactions.filter(status="pending").exists():
            agent.handle_pending_actions()

@@ -132,8 +145,6 @@ class CheckRunner(APIView):

    def get(self, request, agentid):
        agent = get_object_or_404(Agent, agent_id=agentid)
        agent.last_seen = djangotime.now()
        agent.save(update_fields=["last_seen"])
        checks = Check.objects.filter(agent__pk=agent.pk, overriden_by_policy=False)

        ret = {

@@ -144,10 +155,23 @@ class CheckRunner(APIView):
        return Response(ret)

    def patch(self, request):
        from logs.models import AuditLog

        check = get_object_or_404(Check, pk=request.data["id"])
        check.last_run = djangotime.now()
        check.save(update_fields=["last_run"])
        status = check.handle_checkv2(request.data)

        # create audit entry
        AuditLog.objects.create(
            username=check.agent.hostname,
            agent=check.agent.hostname,
            object_type="agent",
            action="check_run",
            message=f"{check.readable_desc} was run on {check.agent.hostname}. Status: {status}",
            after_value=Check.serialize(check),
        )

        return Response(status)


@@ -165,6 +189,8 @@ class TaskRunner(APIView):
        return Response(TaskGOGetSerializer(task).data)

    def patch(self, request, pk, agentid):
        from logs.models import AuditLog

        agent = get_object_or_404(Agent, agent_id=agentid)
        task = get_object_or_404(AutomatedTask, pk=pk)

@@ -173,6 +199,17 @@ class TaskRunner(APIView):
        )
        serializer.is_valid(raise_exception=True)
        serializer.save(last_run=djangotime.now())

        new_task = AutomatedTask.objects.get(pk=task.pk)
        AuditLog.objects.create(
            username=agent.hostname,
            agent=agent.hostname,
            object_type="agent",
            action="task_run",
            message=f"Scheduled Task {task.name} was run on {agent.hostname}",
            after_value=AutomatedTask.serialize(new_task),
        )

        return Response("ok")


@@ -306,21 +343,16 @@ class WinUpdater(APIView):
        agent.save(update_fields=["needs_reboot"])

        if reboot:
            r = agent.salt_api_cmd(
                timeout=15,
                func="system.reboot",
                arg=7,
                kwargs={"in_seconds": True},
            )

            if r == "timeout" or r == "error" or (isinstance(r, bool) and not r):
                check_for_updates_task.apply_async(
                    queue="wupdate", kwargs={"pk": agent.pk, "wait": False}
                )
            if agent.has_nats:
                asyncio.run(agent.nats_cmd({"func": "rebootnow"}, wait=False))
            else:
                logger.info(
                    f"{agent.hostname} is rebooting after updates were installed."
                agent.salt_api_async(
                    func="system.reboot",
                    arg=7,
                    kwargs={"in_seconds": True},
                )

            logger.info(f"{agent.hostname} is rebooting after updates were installed.")
        else:
            check_for_updates_task.apply_async(
                queue="wupdate", kwargs={"pk": agent.pk, "wait": False}

@@ -385,31 +417,9 @@ class MeshExe(APIView):


class NewAgent(APIView):
    """ For the installer """

    def post(self, request):
        """
        Creates and returns the agents auth token
        which is stored in the agent's local db
        and used to authenticate every agent request
        """
        from logs.models import AuditLog

        if "agent_id" not in request.data:
            return notify_error("Invalid payload")

        agentid = request.data["agent_id"]
        if Agent.objects.filter(agent_id=agentid).exists():
            return notify_error(
                "Agent already exists. Remove old agent first if trying to re-install"
            )

        user = User.objects.create_user(
            username=agentid, password=User.objects.make_random_password(60)
        )
        token = Token.objects.create(user=user)
        return Response({"token": token.key})

    def patch(self, request):
        """ Creates the agent """

        if Agent.objects.filter(agent_id=request.data["agent_id"]).exists():

@@ -430,13 +440,39 @@ class NewAgent(APIView):
        agent.salt_id = f"{agent.hostname}-{agent.pk}"
        agent.save(update_fields=["salt_id"])

        user = User.objects.create_user(
            username=request.data["agent_id"],
            agent=agent,
            password=User.objects.make_random_password(60),
        )

        token = Token.objects.create(user=user)

        if agent.monitoring_type == "workstation":
            WinUpdatePolicy(agent=agent, run_time_days=[5, 6]).save()
        else:
            WinUpdatePolicy(agent=agent).save()

        reload_nats()

        # Generate policies for new agent
        agent.generate_checks_from_policies()
|
||||
agent.generate_tasks_from_policies()
|
||||
|
||||
return Response({"pk": agent.pk, "saltid": f"{agent.hostname}-{agent.pk}"})
|
||||
# create agent install audit record
|
||||
AuditLog.objects.create(
|
||||
username=request.user,
|
||||
agent=agent.hostname,
|
||||
object_type="agent",
|
||||
action="agent_install",
|
||||
message=f"{request.user} installed new agent {agent.hostname}",
|
||||
after_value=Agent.serialize(agent),
|
||||
)
|
||||
|
||||
return Response(
|
||||
{
|
||||
"pk": agent.pk,
|
||||
"saltid": f"{agent.hostname}-{agent.pk}",
|
||||
"token": token.key,
|
||||
}
|
||||
)
|
||||
|
||||
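With the reworked NewAgent view, the patch call that creates the agent now also returns the auth token, reloads nats, and writes an agent_install audit entry. A rough installer-side sketch of that flow; the base URL, path, and auth header are assumptions, not documented values:

```python
# Hypothetical installer call; only fields visible in this diff are shown.
import requests

base = "https://rmm.example.com"                         # placeholder
headers = {"Authorization": "Token <installer-token>"}   # placeholder

r = requests.patch(
    f"{base}/api/v3/newagent/",  # path assumed, not confirmed by this diff
    json={"agent_id": "abc123", "hostname": "DESKTOP-01", "monitoring_type": "workstation"},
    headers=headers,
)
# per the response block above: {"pk": ..., "saltid": "<hostname>-<pk>", "token": "<agent token>"}
print(r.json())
```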
@@ -206,6 +206,7 @@ def delete_win_task_schedule(pk, pending_action=False):
|
||||
|
||||
@app.task
|
||||
def run_win_task(pk):
|
||||
# TODO deprecated, remove this function once salt gone
|
||||
task = AutomatedTask.objects.get(pk=pk)
|
||||
r = task.agent.salt_api_async(func="task.run", arg=[f"name={task.win_task_name}"])
|
||||
return "ok"
|
||||
|
||||
@@ -181,10 +181,10 @@ class TestAutotaskViews(TacticalTestCase):
|
||||
|
||||
self.check_not_authenticated("delete", url)
|
||||
|
||||
@patch("autotasks.tasks.run_win_task.delay")
|
||||
def test_run_autotask(self, run_win_task):
|
||||
@patch("agents.models.Agent.nats_cmd")
|
||||
def test_run_autotask(self, nats_cmd):
|
||||
# setup data
|
||||
agent = baker.make_recipe("agents.agent")
|
||||
agent = baker.make_recipe("agents.agent", version="1.1.0")
|
||||
task = baker.make("autotasks.AutomatedTask", agent=agent)
|
||||
|
||||
# test invalid url
|
||||
@@ -195,7 +195,15 @@ class TestAutotaskViews(TacticalTestCase):
|
||||
url = f"/tasks/runwintask/{task.id}/"
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
run_win_task.assert_called_with(task.id)
|
||||
nats_cmd.assert_called_with({"func": "runtask", "taskpk": task.id}, wait=False)
|
||||
nats_cmd.reset_mock()
|
||||
|
||||
old_agent = baker.make_recipe("agents.agent", version="1.0.2")
|
||||
task2 = baker.make("autotasks.AutomatedTask", agent=old_agent)
|
||||
url = f"/tasks/runwintask/{task2.id}/"
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
nats_cmd.assert_not_called()
|
||||
|
||||
self.check_not_authenticated("get", url)
|
||||
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
import asyncio
|
||||
import pytz
|
||||
from django.shortcuts import get_object_or_404
|
||||
|
||||
@@ -17,7 +18,6 @@ from .serializers import TaskSerializer, AutoTaskSerializer
|
||||
from .tasks import (
|
||||
create_win_task_schedule,
|
||||
delete_win_task_schedule,
|
||||
run_win_task,
|
||||
enable_or_disable_win_task,
|
||||
)
|
||||
from tacticalrmm.utils import notify_error
|
||||
@@ -114,5 +114,8 @@ class AutoTask(APIView):
|
||||
@api_view()
|
||||
def run_task(request, pk):
|
||||
task = get_object_or_404(AutomatedTask, pk=pk)
|
||||
run_win_task.delay(task.pk)
|
||||
if not task.agent.has_nats:
|
||||
return notify_error("Requires agent version 1.1.0 or greater")
|
||||
|
||||
asyncio.run(task.agent.nats_cmd({"func": "runtask", "taskpk": task.pk}, wait=False))
|
||||
return Response(f"{task.name} will now be run on {task.agent.hostname}")
|
||||
|
||||
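run_task above, and the run_checks and service endpoints later in this diff, all follow the same shape: reject agents older than 1.1.0, then publish the nats command without waiting for a reply. Condensed into one sketch, with Agent.has_nats and Agent.nats_cmd taken as given from this diff:

```python
# Common pattern for the fire-and-forget nats endpoints in this changeset (sketch).
import asyncio
from rest_framework.response import Response
from tacticalrmm.utils import notify_error

def fire_and_forget(agent, payload):
    if not agent.has_nats:  # property assumed to mean agent version >= 1.1.0
        return notify_error("Requires agent version 1.1.0 or greater")
    asyncio.run(agent.nats_cmd(payload, wait=False))  # publish and return immediately
    return Response("ok")
```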
@@ -1,9 +1,8 @@
|
||||
import base64
|
||||
import asyncio
|
||||
import string
|
||||
import os
|
||||
import json
|
||||
import pytz
|
||||
import zlib
|
||||
from statistics import mean
|
||||
|
||||
from django.db import models
|
||||
@@ -306,12 +305,16 @@ class Check(BaseAuditModel):
|
||||
self.status = "passing"
|
||||
else:
|
||||
if self.agent and self.restart_if_stopped:
|
||||
r = self.agent.salt_api_cmd(
|
||||
func="service.restart", arg=self.svc_name, timeout=45
|
||||
)
|
||||
if r == "timeout" or r == "error":
|
||||
nats_data = {
|
||||
"func": "winsvcaction",
|
||||
"payload": {"name": self.svc_name, "action": "start"},
|
||||
}
|
||||
r = asyncio.run(self.agent.nats_cmd(nats_data, timeout=32))
|
||||
if r == "timeout" or r == "natsdown":
|
||||
self.status = "failing"
|
||||
elif isinstance(r, bool) and r:
|
||||
elif not r["success"] and r["errormsg"]:
|
||||
self.status = "failing"
|
||||
elif r["success"]:
|
||||
self.status = "passing"
|
||||
self.more_info = f"Status RUNNING"
|
||||
else:
|
||||
@@ -336,8 +339,7 @@ class Check(BaseAuditModel):
|
||||
eventID = self.event_id
|
||||
source = self.event_source
|
||||
message = self.event_message
|
||||
|
||||
r = json.loads(zlib.decompress(base64.b64decode(data["log"])))
|
||||
r = data["log"]
|
||||
|
||||
for i in r:
|
||||
if i["eventType"] == eventType:
|
||||
|
||||
@@ -56,10 +56,3 @@ def handle_check_sms_alert_task(pk):
|
||||
check.save(update_fields=["text_sent"])
|
||||
|
||||
return "ok"
|
||||
|
||||
|
||||
@app.task
|
||||
def run_checks_task(pk):
|
||||
agent = Agent.objects.get(pk=pk)
|
||||
agent.salt_api_async(func="win_agent.run_manual_checks")
|
||||
return "ok"
|
||||
|
||||
@@ -1,3 +1,5 @@
|
||||
import asyncio
|
||||
|
||||
from django.shortcuts import get_object_or_404
|
||||
|
||||
from rest_framework.views import APIView
|
||||
@@ -13,7 +15,6 @@ from scripts.models import Script
|
||||
|
||||
from .serializers import CheckSerializer
|
||||
|
||||
from .tasks import run_checks_task
|
||||
|
||||
from automation.tasks import (
|
||||
generate_agent_checks_from_policies_task,
|
||||
@@ -178,7 +179,10 @@ class GetUpdateDeleteCheck(APIView):
|
||||
@api_view()
|
||||
def run_checks(request, pk):
|
||||
agent = get_object_or_404(Agent, pk=pk)
|
||||
run_checks_task.delay(agent.pk)
|
||||
if not agent.has_nats:
|
||||
return notify_error("Requires agent version 1.1.0 or greater")
|
||||
|
||||
asyncio.run(agent.nats_cmd({"func": "runchecks"}, wait=False))
|
||||
return Response(agent.hostname)
|
||||
|
||||
|
||||
|
||||
@@ -55,8 +55,7 @@ class Client(BaseAuditModel):
|
||||
return True
|
||||
|
||||
if agent.overdue_email_alert or agent.overdue_text_alert:
|
||||
if agent.status == "overdue":
|
||||
return True
|
||||
return agent.status == "overdue"
|
||||
|
||||
return False
|
||||
|
||||
@@ -116,8 +115,7 @@ class Site(BaseAuditModel):
|
||||
return True
|
||||
|
||||
if agent.overdue_email_alert or agent.overdue_text_alert:
|
||||
if agent.status == "overdue":
|
||||
return True
|
||||
return agent.status == "overdue"
|
||||
|
||||
return False
|
||||
|
||||
|
||||
@@ -3,7 +3,6 @@ from django.conf import settings
|
||||
from core.models import CoreSettings
|
||||
from .helpers import get_auth_token
|
||||
import asyncio
|
||||
import ssl
|
||||
import websockets
|
||||
import json
|
||||
|
||||
@@ -11,15 +10,15 @@ import json
|
||||
class Command(BaseCommand):
|
||||
help = "Sets up initial mesh central configuration"
|
||||
|
||||
async def websocket_call(self):
|
||||
async def websocket_call(self, mesh_settings):
|
||||
token = get_auth_token(
|
||||
self.mesh_settings.mesh_username, self.mesh_settings.mesh_token
|
||||
mesh_settings.mesh_username, mesh_settings.mesh_token
|
||||
)
|
||||
|
||||
if settings.MESH_WS_URL:
|
||||
uri = f"{settings.MESH_WS_URL}/control.ashx?auth={token}"
|
||||
else:
|
||||
site = self.mesh_settings.mesh_site.replace("https", "wss")
|
||||
site = mesh_settings.mesh_site.replace("https", "wss")
|
||||
uri = f"{site}/control.ashx?auth={token}"
|
||||
|
||||
async with websockets.connect(uri) as websocket:
|
||||
@@ -45,5 +44,5 @@ class Command(BaseCommand):
|
||||
break
|
||||
|
||||
def handle(self, *args, **kwargs):
|
||||
self.mesh_settings = CoreSettings.objects.first()
|
||||
asyncio.get_event_loop().run_until_complete(self.websocket_call())
|
||||
mesh_settings = CoreSettings.objects.first()
|
||||
asyncio.get_event_loop().run_until_complete(self.websocket_call(mesh_settings))
|
||||
|
||||
@@ -1,53 +1,83 @@
|
||||
from django.core.management.base import BaseCommand
|
||||
from django.conf import settings
|
||||
from core.models import CoreSettings
|
||||
from .helpers import get_auth_token
|
||||
import asyncio
|
||||
import websockets
|
||||
import json
|
||||
|
||||
|
||||
class Command(BaseCommand):
|
||||
help = "Sets up initial mesh central configuration"
|
||||
|
||||
async def websocket_call(self):
|
||||
|
||||
token = get_auth_token(
|
||||
self.mesh_settings.mesh_username, self.mesh_settings.mesh_token
|
||||
)
|
||||
|
||||
if settings.MESH_WS_URL:
|
||||
uri = f"{settings.MESH_WS_URL}/control.ashx?auth={token}"
|
||||
else:
|
||||
site = self.mesh_settings.mesh_site.replace("https", "wss")
|
||||
uri = f"{site}/control.ashx?auth={token}"
|
||||
|
||||
async with websockets.connect(uri) as websocket:
|
||||
|
||||
# Get Device groups to see if it exists
|
||||
await websocket.send(json.dumps({"action": "meshes"}))
|
||||
|
||||
async for message in websocket:
|
||||
response = json.loads(message)
|
||||
if response["action"] == "meshes":
|
||||
|
||||
# If no meshes are present
|
||||
if not response["meshes"]:
|
||||
await websocket.send(
|
||||
json.dumps(
|
||||
{
|
||||
"action": "createmesh",
|
||||
"meshname": "TacticalRMM",
|
||||
"meshtype": 2,
|
||||
"responseid": "python",
|
||||
}
|
||||
)
|
||||
)
|
||||
break
|
||||
else:
|
||||
break
|
||||
|
||||
def handle(self, *args, **kwargs):
|
||||
self.mesh_settings = CoreSettings.objects.first()
|
||||
asyncio.get_event_loop().run_until_complete(self.websocket_call())
|
||||
self.stdout.write("Initial Mesh Central setup complete")
|
||||
from django.core.management.base import BaseCommand
|
||||
from django.conf import settings
|
||||
from core.models import CoreSettings
|
||||
from .helpers import get_auth_token
|
||||
import asyncio
|
||||
import websockets
|
||||
import json
|
||||
|
||||
|
||||
class Command(BaseCommand):
|
||||
help = "Sets up initial mesh central configuration"
|
||||
|
||||
async def websocket_call(self, mesh_settings):
|
||||
|
||||
token = get_auth_token(
|
||||
mesh_settings.mesh_username, mesh_settings.mesh_token
|
||||
)
|
||||
|
||||
if settings.MESH_WS_URL:
|
||||
uri = f"{settings.MESH_WS_URL}/control.ashx?auth={token}"
|
||||
else:
|
||||
site = mesh_settings.mesh_site.replace("https", "wss")
|
||||
uri = f"{site}/control.ashx?auth={token}"
|
||||
|
||||
async with websockets.connect(uri) as websocket:
|
||||
|
||||
# Get Device groups to see if it exists
|
||||
await websocket.send(json.dumps({"action": "meshes"}))
|
||||
|
||||
async for message in websocket:
|
||||
response = json.loads(message)
|
||||
if response["action"] == "meshes":
|
||||
|
||||
# If no meshes are present
|
||||
if not response["meshes"]:
|
||||
await websocket.send(
|
||||
json.dumps(
|
||||
{
|
||||
"action": "createmesh",
|
||||
"meshname": "TacticalRMM",
|
||||
"meshtype": 2,
|
||||
"responseid": "python",
|
||||
}
|
||||
)
|
||||
)
|
||||
break
|
||||
else:
|
||||
break
|
||||
|
||||
def handle(self, *args, **kwargs):
|
||||
mesh_settings = CoreSettings.objects.first()
|
||||
|
||||
try:
|
||||
# Check for Mesh Username
|
||||
if not mesh_settings.mesh_username or settings.MESH_USERNAME != mesh_settings.mesh_username:
|
||||
mesh_settings.mesh_username = settings.MESH_USERNAME
|
||||
|
||||
# Check for Mesh Site
|
||||
if not mesh_settings.mesh_site or settings.MESH_SITE != mesh_settings.mesh_site:
|
||||
mesh_settings.mesh_site = settings.MESH_SITE
|
||||
|
||||
# Check for Mesh Token
|
||||
if (
|
||||
not mesh_settings.mesh_token
|
||||
or settings.MESH_TOKEN_KEY != mesh_settings.mesh_token
|
||||
):
|
||||
mesh_settings.mesh_token = settings.MESH_TOKEN_KEY
|
||||
|
||||
mesh_settings.save()
|
||||
|
||||
except AttributeError:
|
||||
self.stdout.write(
|
||||
"Mesh Setup was skipped because the configuration wasn't available. Needs to be setup manually."
|
||||
)
|
||||
return
|
||||
|
||||
try:
|
||||
asyncio.get_event_loop().run_until_complete(self.websocket_call(mesh_settings))
|
||||
self.stdout.write("Initial Mesh Central setup complete")
|
||||
except websockets.exceptions.ConnectionClosedError:
|
||||
self.stdout.write(
|
||||
"Unable to connect to MeshCentral. Please verify it is online and the configuration is correct in the settings."
|
||||
)
|
||||
|
||||
api/tacticalrmm/core/management/commands/reload_nats.py (new file, +9)
@@ -0,0 +1,9 @@
from django.core.management.base import BaseCommand
from tacticalrmm.utils import reload_nats


class Command(BaseCommand):
    help = "Reload Nats"

    def handle(self, *args, **kwargs):
        reload_nats()
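The new reload_nats command is just a thin wrapper around tacticalrmm.utils.reload_nats, so besides `python manage.py reload_nats` it can also be triggered from deploy or setup code:

```python
# Programmatic equivalent of running `python manage.py reload_nats`.
from django.core.management import call_command

call_command("reload_nats")
```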
@@ -72,16 +72,14 @@ class CoreSettings(BaseAuditModel):
|
||||
if not self.pk and CoreSettings.objects.exists():
|
||||
raise ValidationError("There can only be one CoreSettings instance")
|
||||
|
||||
# Only runs on first create
|
||||
# for install script
|
||||
if not self.pk:
|
||||
mesh_settings = self.get_initial_mesh_settings()
|
||||
|
||||
if "mesh_token" in mesh_settings:
|
||||
self.mesh_token = mesh_settings["mesh_token"]
|
||||
if "mesh_username" in mesh_settings:
|
||||
self.mesh_username = mesh_settings["mesh_username"]
|
||||
if "mesh_site" in mesh_settings:
|
||||
self.mesh_site = mesh_settings["mesh_site"]
|
||||
try:
|
||||
self.mesh_site = settings.MESH_SITE
|
||||
self.mesh_username = settings.MESH_USERNAME
|
||||
self.mesh_token = settings.MESH_TOKEN_KEY
|
||||
except:
|
||||
pass
|
||||
|
||||
return super(CoreSettings, self).save(*args, **kwargs)
|
||||
|
||||
@@ -121,8 +119,8 @@ class CoreSettings(BaseAuditModel):
|
||||
and self.smtp_port
|
||||
):
|
||||
return True
|
||||
else:
|
||||
return False
|
||||
|
||||
return False
|
||||
|
||||
def send_mail(self, subject, body, test=False):
|
||||
|
||||
@@ -168,60 +166,9 @@ class CoreSettings(BaseAuditModel):
|
||||
except Exception as e:
|
||||
logger.error(f"SMS failed to send: {e}")
|
||||
|
||||
def get_initial_mesh_settings(self):
|
||||
|
||||
mesh_settings = {}
|
||||
|
||||
# Check for Mesh Username
|
||||
try:
|
||||
if settings.MESH_USERNAME:
|
||||
mesh_settings["mesh_username"] = settings.MESH_USERNAME
|
||||
else:
|
||||
raise AttributeError("MESH_USERNAME doesn't exist")
|
||||
except AttributeError:
|
||||
pass
|
||||
|
||||
# Check for Mesh Site
|
||||
try:
|
||||
if settings.MESH_SITE:
|
||||
mesh_settings["mesh_site"] = settings.MESH_SITE
|
||||
else:
|
||||
raise AttributeError("MESH_SITE doesn't exist")
|
||||
except AttributeError:
|
||||
pass
|
||||
|
||||
# Check for Mesh Token
|
||||
try:
|
||||
if settings.MESH_TOKEN_KEY:
|
||||
mesh_settings["mesh_token"] = settings.MESH_TOKEN_KEY
|
||||
else:
|
||||
raise AttributeError("MESH_TOKEN_KEY doesn't exist")
|
||||
except AttributeError:
|
||||
filepath = "/token/token.key"
|
||||
counter = 0
|
||||
while counter < 12:
|
||||
try:
|
||||
with open(filepath, "r") as read_file:
|
||||
key = read_file.readlines()
|
||||
|
||||
# Remove key file contents for security reasons
|
||||
with open(filepath, "w") as write_file:
|
||||
write_file.write("")
|
||||
|
||||
# readlines() returns an array. Get first item
|
||||
mesh_settings["mesh_token"] = key[0].rstrip()
|
||||
break
|
||||
except (IOError, IndexError):
|
||||
pass
|
||||
|
||||
counter = counter + 1
|
||||
time.sleep(10)
|
||||
|
||||
return mesh_settings
|
||||
|
||||
@staticmethod
|
||||
def serialize(core):
|
||||
# serializes the core and returns json
|
||||
from .serializers import CoreSerializer
|
||||
|
||||
return CoreSerializer(core).data
|
||||
return CoreSerializer(core).data
|
||||
|
||||
@@ -4,8 +4,6 @@ from loguru import logger
|
||||
from django.conf import settings
|
||||
from django.utils import timezone as djangotime
|
||||
from tacticalrmm.celery import app
|
||||
from accounts.models import User
|
||||
from agents.models import Agent
|
||||
from autotasks.models import AutomatedTask
|
||||
from autotasks.tasks import delete_win_task_schedule
|
||||
|
||||
@@ -14,15 +12,6 @@ logger.configure(**settings.LOG_CONFIG)
|
||||
|
||||
@app.task
|
||||
def core_maintenance_tasks():
|
||||
# cleanup any leftover agent user accounts
|
||||
agents = Agent.objects.values_list("agent_id", flat=True)
|
||||
users = User.objects.exclude(username__in=agents).filter(last_login=None)
|
||||
if users:
|
||||
users.delete()
|
||||
logger.info(
|
||||
"Removed leftover agent user accounts:", str([i.username for i in users])
|
||||
)
|
||||
|
||||
# cleanup expired runonce tasks
|
||||
tasks = AutomatedTask.objects.filter(
|
||||
task_type="runonce",
|
||||
|
||||
@@ -68,7 +68,9 @@ def version(request):
|
||||
|
||||
@api_view()
|
||||
def dashboard_info(request):
|
||||
return Response({"trmm_version": settings.TRMM_VERSION})
|
||||
return Response(
|
||||
{"trmm_version": settings.TRMM_VERSION, "dark_mode": request.user.dark_mode}
|
||||
)
|
||||
|
||||
|
||||
@api_view()
|
||||
|
||||
api/tacticalrmm/logs/migrations/0008_auto_20201110_1431.py (new file, +18)
@@ -0,0 +1,18 @@
|
||||
# Generated by Django 3.1.2 on 2020-11-10 14:31
|
||||
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('logs', '0007_auditlog_debug_info'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='auditlog',
|
||||
name='action',
|
||||
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command')], max_length=100),
|
||||
),
|
||||
]
|
||||
api/tacticalrmm/logs/migrations/0009_auto_20201110_1431.py (new file, +18)
@@ -0,0 +1,18 @@
|
||||
# Generated by Django 3.1.2 on 2020-11-10 14:31
|
||||
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('logs', '0008_auto_20201110_1431'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='auditlog',
|
||||
name='action',
|
||||
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('agent_install', 'Agent Install'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command')], max_length=100),
|
||||
),
|
||||
]
|
||||
api/tacticalrmm/logs/migrations/0010_auto_20201110_2238.py (new file, +23)
@@ -0,0 +1,23 @@
|
||||
# Generated by Django 3.1.2 on 2020-11-10 22:38
|
||||
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('logs', '0009_auto_20201110_1431'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='auditlog',
|
||||
name='action',
|
||||
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('agent_install', 'Agent Install'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command'), ('bulk_action', 'Bulk Action')], max_length=100),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='auditlog',
|
||||
name='object_type',
|
||||
field=models.CharField(choices=[('user', 'User'), ('script', 'Script'), ('agent', 'Agent'), ('policy', 'Policy'), ('winupdatepolicy', 'Patch Policy'), ('client', 'Client'), ('site', 'Site'), ('check', 'Check'), ('automatedtask', 'Automated Task'), ('coresettings', 'Core Settings'), ('bulk', 'Bulk')], max_length=100),
|
||||
),
|
||||
]
|
||||
api/tacticalrmm/logs/migrations/0011_auto_20201119_0854.py (new file, +18)
@@ -0,0 +1,18 @@
|
||||
# Generated by Django 3.1.3 on 2020-11-19 08:54
|
||||
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('logs', '0010_auto_20201110_2238'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='pendingaction',
|
||||
name='action_type',
|
||||
field=models.CharField(blank=True, choices=[('schedreboot', 'Scheduled Reboot'), ('taskaction', 'Scheduled Task Action'), ('agentupdate', 'Agent Update')], max_length=255, null=True),
|
||||
),
|
||||
]
|
||||
@@ -1,4 +1,5 @@
|
||||
import datetime as dt
|
||||
import json
|
||||
from abc import abstractmethod
|
||||
from django.db import models
|
||||
from tacticalrmm.middleware import get_username, get_debug_info
|
||||
@@ -6,6 +7,7 @@ from tacticalrmm.middleware import get_username, get_debug_info
|
||||
ACTION_TYPE_CHOICES = [
|
||||
("schedreboot", "Scheduled Reboot"),
|
||||
("taskaction", "Scheduled Task Action"),
|
||||
("agentupdate", "Agent Update"),
|
||||
]
|
||||
|
||||
AUDIT_ACTION_TYPE_CHOICES = [
|
||||
@@ -15,9 +17,13 @@ AUDIT_ACTION_TYPE_CHOICES = [
|
||||
("modify", "Modify Object"),
|
||||
("add", "Add Object"),
|
||||
("view", "View Object"),
|
||||
("check_run", "Check Run"),
|
||||
("task_run", "Task Run"),
|
||||
("agent_install", "Agent Install"),
|
||||
("remote_session", "Remote Session"),
|
||||
("execute_script", "Execute Script"),
|
||||
("execute_command", "Execute Command"),
|
||||
("bulk_action", "Bulk Action"),
|
||||
]
|
||||
|
||||
AUDIT_OBJECT_TYPE_CHOICES = [
|
||||
@@ -31,6 +37,7 @@ AUDIT_OBJECT_TYPE_CHOICES = [
|
||||
("check", "Check"),
|
||||
("automatedtask", "Automated Task"),
|
||||
("coresettings", "Core Settings"),
|
||||
("bulk", "Bulk"),
|
||||
]
|
||||
|
||||
# taskaction details format
|
||||
@@ -170,6 +177,45 @@ class AuditLog(models.Model):
|
||||
debug_info=debug_info,
|
||||
)
|
||||
|
||||
@staticmethod
|
||||
def audit_bulk_action(username, action, affected, debug_info={}):
|
||||
from clients.models import Client, Site
|
||||
from agents.models import Agent
|
||||
from scripts.models import Script
|
||||
|
||||
target = ""
|
||||
agents = None
|
||||
|
||||
if affected["target"] == "all":
|
||||
target = "on all agents"
|
||||
elif affected["target"] == "client":
|
||||
client = Client.objects.get(pk=affected["client"])
|
||||
target = f"on all agents within client: {client.name}"
|
||||
elif affected["target"] == "site":
|
||||
site = Site.objects.get(pk=affected["site"])
|
||||
target = f"on all agents within site: {site.client.name}\\{site.name}"
|
||||
elif affected["target"] == "agents":
|
||||
agents = Agent.objects.filter(pk__in=affected["agentPKs"]).values_list(
|
||||
"hostname", flat=True
|
||||
)
|
||||
target = "on multiple agents"
|
||||
|
||||
if action == "script":
|
||||
script = Script.objects.get(pk=affected["scriptPK"])
|
||||
action = f"script: {script.name}"
|
||||
|
||||
if agents:
|
||||
affected["agent_hostnames"] = list(agents)
|
||||
|
||||
AuditLog.objects.create(
|
||||
username=username,
|
||||
object_type="bulk",
|
||||
action="bulk_action",
|
||||
message=f"{username} executed bulk {action} {target}",
|
||||
debug_info=debug_info,
|
||||
after_value=affected,
|
||||
)
|
||||
|
||||
|
||||
class DebugLog(models.Model):
|
||||
pass
|
||||
@@ -203,7 +249,7 @@ class PendingAction(models.Model):
|
||||
obj = dt.datetime.strptime(self.details["time"], "%Y-%m-%d %H:%M:%S")
|
||||
return dt.datetime.strftime(obj, "%B %d, %Y at %I:%M %p")
|
||||
|
||||
elif self.action_type == "taskaction":
|
||||
elif self.action_type == "taskaction" or self.action_type == "agentupdate":
|
||||
return "Next agent check-in"
|
||||
|
||||
@property
|
||||
@@ -211,6 +257,9 @@ class PendingAction(models.Model):
|
||||
if self.action_type == "schedreboot":
|
||||
return "Device pending reboot"
|
||||
|
||||
elif self.action_type == "agentupdate":
|
||||
return f"Agent update to {self.details['version']}"
|
||||
|
||||
elif self.action_type == "taskaction":
|
||||
if self.details["action"] == "taskdelete":
|
||||
return "Device pending task deletion"
|
||||
@@ -246,28 +295,31 @@ class BaseAuditModel(models.Model):
|
||||
|
||||
before_value = {}
|
||||
object_class = type(self)
|
||||
object_name = object_class.__name__.lower()
|
||||
username = get_username()
|
||||
|
||||
# populate created_by and modified_by fields on instance
|
||||
if not getattr(self, "created_by", None):
|
||||
self.created_by = get_username()
|
||||
self.created_by = username
|
||||
if hasattr(self, "modified_by"):
|
||||
self.modified_by = get_username()
|
||||
self.modified_by = username
|
||||
|
||||
# capture object properties before edit
|
||||
if self.pk:
|
||||
before_value = object_class.objects.get(pk=self.id)
|
||||
|
||||
# dont create entry for agent add since that is done in view
|
||||
if not self.pk:
|
||||
AuditLog.audit_object_add(
|
||||
get_username(),
|
||||
object_class.__name__.lower(),
|
||||
username,
|
||||
object_name,
|
||||
object_class.serialize(self),
|
||||
self.__str__(),
|
||||
debug_info=get_debug_info(),
|
||||
)
|
||||
else:
|
||||
AuditLog.audit_object_changed(
|
||||
get_username(),
|
||||
username,
|
||||
object_class.__name__.lower(),
|
||||
object_class.serialize(before_value),
|
||||
object_class.serialize(self),
|
||||
@@ -280,6 +332,7 @@ class BaseAuditModel(models.Model):
|
||||
def delete(self, *args, **kwargs):
|
||||
|
||||
if get_username():
|
||||
|
||||
object_class = type(self)
|
||||
AuditLog.audit_object_delete(
|
||||
get_username(),
|
||||
|
||||
@@ -11,6 +11,10 @@ class TestAuditViews(TacticalTestCase):
|
||||
self.setup_coresettings()
|
||||
|
||||
def create_audit_records(self):
|
||||
|
||||
# create clients for client filter
|
||||
site = baker.make("clients.Site")
|
||||
baker.make_recipe("agents.agent", site=site, hostname="AgentHostname1")
|
||||
# user jim agent logs
|
||||
baker.make_recipe(
|
||||
"logs.agent_logs",
|
||||
@@ -75,11 +79,13 @@ class TestAuditViews(TacticalTestCase):
|
||||
_quantity=13,
|
||||
)
|
||||
|
||||
return site
|
||||
|
||||
def test_get_audit_logs(self):
|
||||
url = "/logs/auditlogs/"
|
||||
|
||||
# create data
|
||||
self.create_audit_records()
|
||||
site = self.create_audit_records()
|
||||
|
||||
# test data and result counts
|
||||
data = [
|
||||
@@ -111,6 +117,9 @@ class TestAuditViews(TacticalTestCase):
|
||||
"count": 40,
|
||||
},
|
||||
{"filter": {"timeFilter": 35, "userFilter": ["james", "jim"]}, "count": 81},
|
||||
{"filter": {"objectFilter": ["user"]}, "count": 26},
|
||||
{"filter": {"actionFilter": ["login"]}, "count": 12},
|
||||
{"filter": {"clientFilter": [site.client.id]}, "count": 23},
|
||||
]
|
||||
|
||||
for req in data:
|
||||
|
||||
@@ -4,6 +4,7 @@ from django.conf import settings
|
||||
from django.shortcuts import get_object_or_404
|
||||
from django.http import HttpResponse
|
||||
from django.utils import timezone as djangotime
|
||||
from django.db.models import Q
|
||||
from datetime import datetime as dt
|
||||
|
||||
from rest_framework.response import Response
|
||||
@@ -22,32 +23,52 @@ from .tasks import cancel_pending_action_task
|
||||
|
||||
class GetAuditLogs(APIView):
|
||||
def patch(self, request):
|
||||
from clients.models import Client
|
||||
from agents.models import Agent
|
||||
|
||||
auditLogs = None
|
||||
if "agentFilter" in request.data and "userFilter" in request.data:
|
||||
audit_logs = AuditLog.objects.filter(
|
||||
agent__in=request.data["agentFilter"],
|
||||
username__in=request.data["userFilter"],
|
||||
agentFilter = Q()
|
||||
clientFilter = Q()
|
||||
actionFilter = Q()
|
||||
objectFilter = Q()
|
||||
userFilter = Q()
|
||||
timeFilter = Q()
|
||||
|
||||
if "agentFilter" in request.data:
|
||||
agentFilter = Q(agent__in=request.data["agentFilter"])
|
||||
|
||||
elif "clientFilter" in request.data:
|
||||
clients = Client.objects.filter(
|
||||
pk__in=request.data["clientFilter"]
|
||||
).values_list("id")
|
||||
agents = Agent.objects.filter(site__client_id__in=clients).values_list(
|
||||
"hostname"
|
||||
)
|
||||
clientFilter = Q(agent__in=agents)
|
||||
|
||||
elif "userFilter" in request.data:
|
||||
audit_logs = AuditLog.objects.filter(
|
||||
username__in=request.data["userFilter"]
|
||||
)
|
||||
if "userFilter" in request.data:
|
||||
userFilter = Q(username__in=request.data["userFilter"])
|
||||
|
||||
elif "agentFilter" in request.data:
|
||||
audit_logs = AuditLog.objects.filter(agent__in=request.data["agentFilter"])
|
||||
if "actionFilter" in request.data:
|
||||
actionFilter = Q(action__in=request.data["actionFilter"])
|
||||
|
||||
else:
|
||||
audit_logs = AuditLog.objects.all()
|
||||
if "objectFilter" in request.data:
|
||||
objectFilter = Q(object_type__in=request.data["objectFilter"])
|
||||
|
||||
if audit_logs and "timeFilter" in request.data:
|
||||
audit_logs = audit_logs.filter(
|
||||
if "timeFilter" in request.data:
|
||||
timeFilter = Q(
|
||||
entry_time__lte=djangotime.make_aware(dt.today()),
|
||||
entry_time__gt=djangotime.make_aware(dt.today())
|
||||
- djangotime.timedelta(days=request.data["timeFilter"]),
|
||||
)
|
||||
|
||||
audit_logs = (
|
||||
AuditLog.objects.filter(agentFilter | clientFilter)
|
||||
.filter(userFilter)
|
||||
.filter(actionFilter)
|
||||
.filter(objectFilter)
|
||||
.filter(timeFilter)
|
||||
)
|
||||
|
||||
return Response(AuditLogSerializer(audit_logs, many=True).data)
|
||||
|
||||
|
||||
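The rewritten handler above builds one empty Q() per filter, so anything the client omits becomes a no-op, the agent and client filters stay mutually exclusive via OR, and the chained .filter() calls AND the rest together. A stripped-down illustration with an example payload:

```python
# Illustration of the Q combination above (not part of the diff).
from django.db.models import Q
from logs.models import AuditLog

data = {"userFilter": ["jim"], "actionFilter": ["login"]}  # example request payload

user_q = Q(username__in=data["userFilter"]) if "userFilter" in data else Q()
action_q = Q(action__in=data["actionFilter"]) if "actionFilter" in data else Q()

# an empty Q() matches everything, so omitted filters drop out of the query
logs = AuditLog.objects.filter(Q()).filter(user_q).filter(action_q)
```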
@@ -58,9 +79,8 @@ class FilterOptionsAuditLog(APIView):
|
||||
return Response(AgentHostnameSerializer(agents, many=True).data)
|
||||
|
||||
if request.data["type"] == "user":
|
||||
agents = Agent.objects.values_list("agent_id", flat=True)
|
||||
users = User.objects.exclude(username__in=agents).filter(
|
||||
username__icontains=request.data["pattern"]
|
||||
users = User.objects.filter(
|
||||
username__icontains=request.data["pattern"], agent=None
|
||||
)
|
||||
return Response(UserSerializer(users, many=True).data)
|
||||
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
amqp==2.6.1
|
||||
asgiref==3.3.0
|
||||
asyncio-nats-client==0.11.4
|
||||
billiard==3.6.3.0
|
||||
celery==4.4.6
|
||||
certifi==2020.11.8
|
||||
@@ -28,6 +29,7 @@ redis==3.5.3
|
||||
requests==2.24.0
|
||||
six==1.15.0
|
||||
sqlparse==0.4.1
|
||||
tldextract==3.0.2
|
||||
twilio==6.47.0
|
||||
urllib3==1.25.11
|
||||
uWSGI==2.0.19.1
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
from django.db import models
|
||||
from logs.models import BaseAuditModel
|
||||
from django.conf import settings
|
||||
|
||||
SCRIPT_SHELLS = [
|
||||
("powershell", "Powershell"),
|
||||
@@ -38,9 +39,9 @@ class Script(BaseAuditModel):
|
||||
@property
|
||||
def file(self):
|
||||
if self.script_type == "userdefined":
|
||||
return f"/srv/salt/scripts/userdefined/{self.filename}"
|
||||
return f"{settings.SCRIPTS_DIR}/userdefined/{self.filename}"
|
||||
else:
|
||||
return f"/srv/salt/scripts/{self.filename}"
|
||||
return f"{settings.SCRIPTS_DIR}/{self.filename}"
|
||||
|
||||
@property
|
||||
def code(self):
|
||||
@@ -62,7 +63,13 @@ class Script(BaseAuditModel):
|
||||
# load community uploaded scripts into the database
|
||||
# skip ones that already exist, only updating name / desc in case it changes
|
||||
# files will be copied by the update script or in docker to /srv/salt/scripts
|
||||
scripts_dir = os.path.join(Path(settings.BASE_DIR).parents[1], "scripts")
|
||||
|
||||
# for install script
|
||||
if not settings.DOCKER_BUILD:
|
||||
scripts_dir = os.path.join(Path(settings.BASE_DIR).parents[1], "scripts")
|
||||
# for docker
|
||||
else:
|
||||
scripts_dir = settings.SCRIPTS_DIR
|
||||
|
||||
with open(
|
||||
os.path.join(settings.BASE_DIR, "scripts/community_scripts.json")
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
import os
|
||||
|
||||
from django.conf import settings
|
||||
from rest_framework.serializers import ModelSerializer, ValidationError, ReadOnlyField
|
||||
from .models import Script
|
||||
|
||||
@@ -27,7 +28,7 @@ class ScriptSerializer(ModelSerializer):
|
||||
# but only if adding, not if editing since will overwrite if edit
|
||||
if not self.instance:
|
||||
script_path = os.path.join(
|
||||
"/srv/salt/scripts/userdefined", val["filename"]
|
||||
f"{settings.SCRIPTS_DIR}/userdefined", val["filename"]
|
||||
)
|
||||
if os.path.exists(script_path):
|
||||
raise ValidationError(
|
||||
|
||||
@@ -1,38 +1,73 @@
|
||||
import asyncio
|
||||
|
||||
from tacticalrmm.celery import app
|
||||
from agents.models import Agent
|
||||
from .models import Script
|
||||
from scripts.models import Script
|
||||
|
||||
|
||||
@app.task
|
||||
def run_script_bg_task(data):
|
||||
agent = Agent.objects.get(pk=data["agentpk"])
|
||||
script = Script.objects.get(pk=data["scriptpk"])
|
||||
def handle_bulk_command_task(agentpks, cmd, shell, timeout):
|
||||
agents = Agent.objects.filter(pk__in=agentpks)
|
||||
|
||||
agent.salt_api_async(
|
||||
func="win_agent.run_script",
|
||||
kwargs={
|
||||
"filepath": script.filepath,
|
||||
"filename": script.filename,
|
||||
"shell": script.shell,
|
||||
"timeout": data["timeout"],
|
||||
"args": data["args"],
|
||||
},
|
||||
)
|
||||
agents_nats = [agent for agent in agents if agent.has_nats]
|
||||
agents_salt = [agent for agent in agents if not agent.has_nats]
|
||||
minions = [agent.salt_id for agent in agents_salt]
|
||||
|
||||
if minions:
|
||||
Agent.salt_batch_async(
|
||||
minions=minions,
|
||||
func="cmd.run_bg",
|
||||
kwargs={
|
||||
"cmd": cmd,
|
||||
"shell": shell,
|
||||
"timeout": timeout,
|
||||
},
|
||||
)
|
||||
|
||||
if agents_nats:
|
||||
nats_data = {
|
||||
"func": "rawcmd",
|
||||
"timeout": timeout,
|
||||
"payload": {
|
||||
"command": cmd,
|
||||
"shell": shell,
|
||||
},
|
||||
}
|
||||
for agent in agents_nats:
|
||||
asyncio.run(agent.nats_cmd(nats_data, wait=False))
|
||||
|
||||
|
||||
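handle_bulk_command_task now takes plain primary keys and decides per agent whether to publish over nats or fall back to a single salt batch call. A hypothetical caller; the pks, command, and import path are assumptions based on this diff:

```python
# Sketch: enqueue a bulk shell command from a view or another task.
from scripts.tasks import handle_bulk_command_task  # module path assumed

handle_bulk_command_task.delay(
    [12, 15, 31],           # agent pks (placeholders)
    "ipconfig /flushdns",   # cmd
    "cmd",                  # shell
    30,                     # timeout in seconds
)
```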
@app.task
|
||||
def run_bulk_script_task(data):
|
||||
# for powershell and batch scripts only, workaround for salt bg script bug
|
||||
script = Script.objects.get(pk=data["scriptpk"])
|
||||
def handle_bulk_script_task(scriptpk, agentpks, args, timeout):
|
||||
script = Script.objects.get(pk=scriptpk)
|
||||
agents = Agent.objects.filter(pk__in=agentpks)
|
||||
|
||||
Agent.salt_batch_async(
|
||||
minions=data["minions"],
|
||||
func="win_agent.run_script",
|
||||
kwargs={
|
||||
"filepath": script.filepath,
|
||||
"filename": script.filename,
|
||||
agents_nats = [agent for agent in agents if agent.has_nats]
|
||||
agents_salt = [agent for agent in agents if not agent.has_nats]
|
||||
minions = [agent.salt_id for agent in agents_salt]
|
||||
|
||||
if minions:
|
||||
Agent.salt_batch_async(
|
||||
minions=minions,
|
||||
func="win_agent.run_script",
|
||||
kwargs={
|
||||
"filepath": script.filepath,
|
||||
"filename": script.filename,
|
||||
"shell": script.shell,
|
||||
"timeout": timeout,
|
||||
"args": args,
|
||||
"bg": True if script.shell == "python" else False, # salt bg script bug
|
||||
},
|
||||
)
|
||||
|
||||
nats_data = {
|
||||
"func": "runscript",
|
||||
"timeout": timeout,
|
||||
"script_args": args,
|
||||
"payload": {
|
||||
"code": script.code,
|
||||
"shell": script.shell,
|
||||
"timeout": data["timeout"],
|
||||
"args": data["args"],
|
||||
},
|
||||
)
|
||||
}
|
||||
for agent in agents_nats:
|
||||
asyncio.run(agent.nats_cmd(nats_data, wait=False))
|
||||
|
||||
@@ -94,7 +94,11 @@ class TestScriptViews(TacticalTestCase):
|
||||
|
||||
def test_load_community_scripts(self):
|
||||
valid_shells = ["powershell", "python", "cmd"]
|
||||
scripts_dir = os.path.join(Path(settings.BASE_DIR).parents[1], "scripts")
|
||||
|
||||
if not settings.DOCKER_BUILD:
|
||||
scripts_dir = os.path.join(Path(settings.BASE_DIR).parents[1], "scripts")
|
||||
else:
|
||||
scripts_dir = settings.SCRIPTS_DIR
|
||||
|
||||
with open(
|
||||
os.path.join(settings.BASE_DIR, "scripts/community_scripts.json")
|
||||
|
||||
@@ -32,8 +32,8 @@ class TestServiceViews(TacticalTestCase):
|
||||
|
||||
self.check_not_authenticated("get", url)
|
||||
|
||||
@patch("agents.models.Agent.salt_api_cmd")
|
||||
def test_get_refreshed_services(self, salt_api_cmd):
|
||||
@patch("agents.models.Agent.nats_cmd")
|
||||
def test_get_refreshed_services(self, nats_cmd):
|
||||
# test a call where agent doesn't exist
|
||||
resp = self.client.get("/services/500/refreshedservices/", format="json")
|
||||
self.assertEqual(resp.status_code, 404)
|
||||
@@ -41,7 +41,7 @@ class TestServiceViews(TacticalTestCase):
|
||||
agent = baker.make_recipe("agents.agent_with_services")
|
||||
url = f"/services/{agent.pk}/refreshedservices/"
|
||||
|
||||
salt_return = [
|
||||
nats_return = [
|
||||
{
|
||||
"pid": 880,
|
||||
"name": "AeLookupSvc",
|
||||
@@ -65,30 +65,23 @@ class TestServiceViews(TacticalTestCase):
|
||||
]
|
||||
|
||||
# test failed attempt
|
||||
salt_api_cmd.return_value = "timeout"
|
||||
nats_cmd.return_value = "timeout"
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
salt_api_cmd.assert_called_with(timeout=15, func="win_agent.get_services")
|
||||
salt_api_cmd.reset_mock()
|
||||
|
||||
# test failed attempt
|
||||
salt_api_cmd.return_value = "error"
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
salt_api_cmd.assert_called_with(timeout=15, func="win_agent.get_services")
|
||||
salt_api_cmd.reset_mock()
|
||||
nats_cmd.assert_called_with(data={"func": "winservices"}, timeout=10)
|
||||
nats_cmd.reset_mock()
|
||||
|
||||
# test successful attempt
|
||||
salt_api_cmd.return_value = salt_return
|
||||
nats_cmd.return_value = nats_return
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
salt_api_cmd.assert_called_with(timeout=15, func="win_agent.get_services")
|
||||
self.assertEquals(Agent.objects.get(pk=agent.pk).services, salt_return)
|
||||
nats_cmd.assert_called_with(data={"func": "winservices"}, timeout=10)
|
||||
self.assertEquals(Agent.objects.get(pk=agent.pk).services, nats_return)
|
||||
|
||||
self.check_not_authenticated("get", url)
|
||||
|
||||
@patch("agents.models.Agent.salt_api_cmd")
|
||||
def test_service_action(self, salt_api_cmd):
|
||||
@patch("agents.models.Agent.nats_cmd")
|
||||
def test_service_action(self, nats_cmd):
|
||||
url = "/services/serviceaction/"
|
||||
|
||||
invalid_data = {"pk": 500, "sv_name": "AeLookupSvc", "sv_action": "restart"}
|
||||
@@ -101,47 +94,37 @@ class TestServiceViews(TacticalTestCase):
|
||||
data = {"pk": agent.pk, "sv_name": "AeLookupSvc", "sv_action": "restart"}
|
||||
|
||||
# test failed attempt
|
||||
salt_api_cmd.return_value = "timeout"
|
||||
nats_cmd.return_value = "timeout"
|
||||
resp = self.client.post(url, data, format="json")
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=45,
|
||||
func=f"service.restart",
|
||||
arg="AeLookupSvc",
|
||||
nats_cmd.assert_called_with(
|
||||
{
|
||||
"func": "winsvcaction",
|
||||
"payload": {
|
||||
"name": "AeLookupSvc",
|
||||
"action": "stop",
|
||||
},
|
||||
},
|
||||
timeout=32,
|
||||
)
|
||||
salt_api_cmd.reset_mock()
|
||||
|
||||
salt_api_cmd.return_value = "error"
|
||||
resp = self.client.post(url, data, format="json")
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=45,
|
||||
func=f"service.restart",
|
||||
arg="AeLookupSvc",
|
||||
)
|
||||
salt_api_cmd.reset_mock()
|
||||
nats_cmd.reset_mock()
|
||||
|
||||
# test successful attempt
|
||||
salt_api_cmd.return_value = True
|
||||
nats_cmd.return_value = {"success": True, "errormsg": ""}
|
||||
resp = self.client.post(url, data, format="json")
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=45,
|
||||
func=f"service.restart",
|
||||
arg="AeLookupSvc",
|
||||
)
|
||||
|
||||
self.check_not_authenticated("post", url)
|
||||
|
||||
@patch("agents.models.Agent.salt_api_cmd")
|
||||
def test_service_detail(self, salt_api_cmd):
|
||||
@patch("agents.models.Agent.nats_cmd")
|
||||
def test_service_detail(self, nats_cmd):
|
||||
# test a call where agent doesn't exist
|
||||
resp = self.client.get(
|
||||
"/services/500/doesntexist/servicedetail/", format="json"
|
||||
)
|
||||
self.assertEqual(resp.status_code, 404)
|
||||
|
||||
salt_return = {
|
||||
nats_return = {
|
||||
"pid": 812,
|
||||
"name": "ALG",
|
||||
"status": "stopped",
|
||||
@@ -156,29 +139,27 @@ class TestServiceViews(TacticalTestCase):
|
||||
url = f"/services/{agent.pk}/alg/servicedetail/"
|
||||
|
||||
# test failed attempt
|
||||
salt_api_cmd.return_value = "timeout"
|
||||
nats_cmd.return_value = "timeout"
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
salt_api_cmd.assert_called_with(timeout=20, func="service.info", arg="alg")
|
||||
salt_api_cmd.reset_mock()
|
||||
|
||||
salt_api_cmd.return_value = "error"
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
salt_api_cmd.assert_called_with(timeout=20, func="service.info", arg="alg")
|
||||
salt_api_cmd.reset_mock()
|
||||
nats_cmd.assert_called_with(
|
||||
{"func": "winsvcdetail", "payload": {"name": "alg"}}, timeout=10
|
||||
)
|
||||
nats_cmd.reset_mock()
|
||||
|
||||
# test successful attempt
|
||||
salt_api_cmd.return_value = salt_return
|
||||
nats_cmd.return_value = nats_return
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
salt_api_cmd.assert_called_with(timeout=20, func="service.info", arg="alg")
|
||||
self.assertEquals(resp.data, salt_return)
|
||||
nats_cmd.assert_called_with(
|
||||
{"func": "winsvcdetail", "payload": {"name": "alg"}}, timeout=10
|
||||
)
|
||||
self.assertEquals(resp.data, nats_return)
|
||||
|
||||
self.check_not_authenticated("get", url)
|
||||
|
||||
@patch("agents.models.Agent.salt_api_cmd")
|
||||
def test_edit_service(self, salt_api_cmd):
|
||||
@patch("agents.models.Agent.nats_cmd")
|
||||
def test_edit_service(self, nats_cmd):
|
||||
url = "/services/editservice/"
|
||||
agent = baker.make_recipe("agents.agent_with_services")
|
||||
|
||||
@@ -189,64 +170,43 @@ class TestServiceViews(TacticalTestCase):
|
||||
|
||||
data = {"pk": agent.pk, "sv_name": "AeLookupSvc", "edit_action": "autodelay"}
|
||||
|
||||
# test failed attempt
|
||||
salt_api_cmd.return_value = "timeout"
|
||||
# test timeout
|
||||
nats_cmd.return_value = "timeout"
|
||||
resp = self.client.post(url, data, format="json")
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=20,
|
||||
func="service.modify",
|
||||
arg=data["sv_name"],
|
||||
kwargs={"start_type": "auto", "start_delayed": True},
|
||||
)
|
||||
salt_api_cmd.reset_mock()
|
||||
|
||||
salt_api_cmd.return_value = "error"
|
||||
resp = self.client.post(url, data, format="json")
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=20,
|
||||
func="service.modify",
|
||||
arg=data["sv_name"],
|
||||
kwargs={"start_type": "auto", "start_delayed": True},
|
||||
)
|
||||
salt_api_cmd.reset_mock()
|
||||
nats_cmd.reset_mock()
|
||||
|
||||
# test successful attempt autodelay
|
||||
salt_api_cmd.return_value = True
|
||||
nats_cmd.return_value = {"success": True, "errormsg": ""}
|
||||
resp = self.client.post(url, data, format="json")
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=20,
|
||||
func="service.modify",
|
||||
arg=data["sv_name"],
|
||||
kwargs={"start_type": "auto", "start_delayed": True},
|
||||
nats_cmd.assert_called_with(
|
||||
{
|
||||
"func": "editwinsvc",
|
||||
"payload": {
|
||||
"name": "AeLookupSvc",
|
||||
"startType": "autodelay",
|
||||
},
|
||||
},
|
||||
timeout=10,
|
||||
)
|
||||
salt_api_cmd.reset_mock()
|
||||
nats_cmd.reset_mock()
|
||||
|
||||
# test successful attempt with auto
|
||||
# test error message from agent
|
||||
data = {"pk": agent.pk, "sv_name": "AeLookupSvc", "edit_action": "auto"}
|
||||
salt_api_cmd.return_value = True
|
||||
nats_cmd.return_value = {
|
||||
"success": False,
|
||||
"errormsg": "The parameter is incorrect",
|
||||
}
|
||||
resp = self.client.post(url, data, format="json")
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=20,
|
||||
func="service.modify",
|
||||
arg=data["sv_name"],
|
||||
kwargs={"start_type": "auto", "start_delayed": False},
|
||||
)
|
||||
salt_api_cmd.reset_mock()
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
nats_cmd.reset_mock()
|
||||
|
||||
# test successful attempt with manual
|
||||
data = {"pk": agent.pk, "sv_name": "AeLookupSvc", "edit_action": "manual"}
|
||||
salt_api_cmd.return_value = True
|
||||
# test catch all
|
||||
data = {"pk": agent.pk, "sv_name": "AeLookupSvc", "edit_action": "auto"}
|
||||
nats_cmd.return_value = {"success": False, "errormsg": ""}
|
||||
resp = self.client.post(url, data, format="json")
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=20,
|
||||
func="service.modify",
|
||||
arg=data["sv_name"],
|
||||
kwargs={"start_type": "manual"},
|
||||
)
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
self.assertEqual(resp.data, "Something went wrong")
|
||||
|
||||
self.check_not_authenticated("post", url)
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
import asyncio
|
||||
from loguru import logger
|
||||
|
||||
from rest_framework.response import Response
|
||||
@@ -30,12 +31,12 @@ def default_services(request):
|
||||
@api_view()
|
||||
def get_refreshed_services(request, pk):
|
||||
agent = get_object_or_404(Agent, pk=pk)
|
||||
r = agent.salt_api_cmd(timeout=15, func="win_agent.get_services")
|
||||
if not agent.has_nats:
|
||||
return notify_error("Requires agent version 1.1.0 or greater")
|
||||
r = asyncio.run(agent.nats_cmd(data={"func": "winservices"}, timeout=10))
|
||||
|
||||
if r == "timeout":
|
||||
return notify_error("Unable to contact the agent")
|
||||
elif r == "error" or not r:
|
||||
return notify_error("Something went wrong")
|
||||
|
||||
agent.services = r
|
||||
agent.save(update_fields=["services"])
|
||||
@@ -44,64 +45,79 @@ def get_refreshed_services(request, pk):
|
||||
|
||||
@api_view(["POST"])
|
||||
def service_action(request):
|
||||
data = request.data
|
||||
pk = data["pk"]
|
||||
service_name = data["sv_name"]
|
||||
service_action = data["sv_action"]
|
||||
agent = get_object_or_404(Agent, pk=pk)
|
||||
r = agent.salt_api_cmd(
|
||||
timeout=45,
|
||||
func=f"service.{service_action}",
|
||||
arg=service_name,
|
||||
)
|
||||
agent = get_object_or_404(Agent, pk=request.data["pk"])
|
||||
if not agent.has_nats:
|
||||
return notify_error("Requires agent version 1.1.0 or greater")
|
||||
action = request.data["sv_action"]
|
||||
data = {
|
||||
"func": "winsvcaction",
|
||||
"payload": {
|
||||
"name": request.data["sv_name"],
|
||||
},
|
||||
}
|
||||
# response struct from agent: {success: bool, errormsg: string}
|
||||
if action == "restart":
|
||||
data["payload"]["action"] = "stop"
|
||||
r = asyncio.run(agent.nats_cmd(data, timeout=32))
|
||||
if r == "timeout":
|
||||
return notify_error("Unable to contact the agent")
|
||||
elif not r["success"] and r["errormsg"]:
|
||||
return notify_error(r["errormsg"])
|
||||
elif r["success"]:
|
||||
data["payload"]["action"] = "start"
|
||||
r = asyncio.run(agent.nats_cmd(data, timeout=32))
|
||||
if r == "timeout":
|
||||
return notify_error("Unable to contact the agent")
|
||||
elif not r["success"] and r["errormsg"]:
|
||||
return notify_error(r["errormsg"])
|
||||
elif r["success"]:
|
||||
return Response("ok")
|
||||
else:
|
||||
data["payload"]["action"] = action
|
||||
r = asyncio.run(agent.nats_cmd(data, timeout=32))
|
||||
if r == "timeout":
|
||||
return notify_error("Unable to contact the agent")
|
||||
elif not r["success"] and r["errormsg"]:
|
||||
return notify_error(r["errormsg"])
|
||||
elif r["success"]:
|
||||
return Response("ok")
|
||||
|
||||
if r == "timeout":
|
||||
return notify_error("Unable to contact the agent")
|
||||
elif r == "error" or not r:
|
||||
return notify_error("Something went wrong")
|
||||
|
||||
return Response("ok")
|
||||
return notify_error("Something went wrong")
|
||||
|
||||
|
||||
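Every branch of service_action now interprets the same {success, errormsg} structure returned by the agent's winsvcaction handler. The repeated checks could be read as one small contract, sketched here for reference (not part of the diff):

```python
# Normalize the agent's winsvcaction reply into (ok, error_message).
def svc_result(r):
    if r == "timeout":
        return False, "Unable to contact the agent"
    if not r["success"] and r["errormsg"]:
        return False, r["errormsg"]
    return bool(r["success"]), ""
```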
@api_view()
|
||||
def service_detail(request, pk, svcname):
|
||||
agent = get_object_or_404(Agent, pk=pk)
|
||||
r = agent.salt_api_cmd(timeout=20, func="service.info", arg=svcname)
|
||||
|
||||
if not agent.has_nats:
|
||||
return notify_error("Requires agent version 1.1.0 or greater")
|
||||
data = {"func": "winsvcdetail", "payload": {"name": svcname}}
|
||||
r = asyncio.run(agent.nats_cmd(data, timeout=10))
|
||||
if r == "timeout":
|
||||
return notify_error("Unable to contact the agent")
|
||||
elif r == "error" or not r:
|
||||
return notify_error("Something went wrong")
|
||||
|
||||
return Response(r)
|
||||
|
||||
|
||||
@api_view(["POST"])
|
||||
def edit_service(request):
|
||||
data = request.data
|
||||
pk = data["pk"]
|
||||
service_name = data["sv_name"]
|
||||
edit_action = data["edit_action"]
|
||||
|
||||
agent = get_object_or_404(Agent, pk=pk)
|
||||
|
||||
if edit_action == "autodelay":
|
||||
kwargs = {"start_type": "auto", "start_delayed": True}
|
||||
elif edit_action == "auto":
|
||||
kwargs = {"start_type": "auto", "start_delayed": False}
|
||||
else:
|
||||
kwargs = {"start_type": edit_action}
|
||||
|
||||
r = agent.salt_api_cmd(
|
||||
timeout=20,
|
||||
func="service.modify",
|
||||
arg=service_name,
|
||||
kwargs=kwargs,
|
||||
)
|
||||
agent = get_object_or_404(Agent, pk=request.data["pk"])
|
||||
if not agent.has_nats:
|
||||
return notify_error("Requires agent version 1.1.0 or greater")
|
||||
data = {
|
||||
"func": "editwinsvc",
|
||||
"payload": {
|
||||
"name": request.data["sv_name"],
|
||||
"startType": request.data["edit_action"],
|
||||
},
|
||||
}
|
||||
|
||||
r = asyncio.run(agent.nats_cmd(data, timeout=10))
|
||||
# response struct from agent: {success: bool, errormsg: string}
|
||||
if r == "timeout":
|
||||
return notify_error("Unable to contact the agent")
|
||||
elif r == "error" or not r:
|
||||
return notify_error("Something went wrong")
|
||||
elif not r["success"] and r["errormsg"]:
|
||||
return notify_error(r["errormsg"])
|
||||
elif r["success"]:
|
||||
return Response("ok")
|
||||
|
||||
return Response("ok")
|
||||
return notify_error("Something went wrong")
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
import asyncio
|
||||
import string
|
||||
from time import sleep
|
||||
from loguru import logger
|
||||
@@ -89,35 +90,36 @@ def update_chocos():
|
||||
@app.task
|
||||
def get_installed_software(pk):
|
||||
agent = Agent.objects.get(pk=pk)
|
||||
r = agent.salt_api_cmd(
|
||||
timeout=30,
|
||||
func="pkg.list_pkgs",
|
||||
kwargs={"include_components": False, "include_updates": False},
|
||||
)
|
||||
if not agent.has_nats:
|
||||
logger.error(f"{agent.salt_id} software list only available in agent >= 1.1.0")
|
||||
return
|
||||
|
||||
if r == "timeout" or r == "error":
|
||||
logger.error(f"Timed out trying to get installed software on {agent.salt_id}")
|
||||
r = asyncio.run(agent.nats_cmd({"func": "softwarelist"}, timeout=20))
|
||||
if r == "timeout" or r == "natsdown":
|
||||
logger.error(f"{agent.salt_id} {r}")
|
||||
return
|
||||
|
||||
printable = set(string.printable)
|
||||
|
||||
try:
|
||||
software = [
|
||||
sw = []
|
||||
for s in r:
|
||||
sw.append(
|
||||
{
|
||||
"name": "".join(filter(lambda x: x in printable, k)),
|
||||
"version": "".join(filter(lambda x: x in printable, v)),
|
||||
"name": "".join(filter(lambda x: x in printable, s["name"])),
|
||||
"version": "".join(filter(lambda x: x in printable, s["version"])),
|
||||
"publisher": "".join(filter(lambda x: x in printable, s["publisher"])),
|
||||
"install_date": s["install_date"],
|
||||
"size": s["size"],
|
||||
"source": s["source"],
|
||||
"location": s["location"],
|
||||
"uninstall": s["uninstall"],
|
||||
}
|
||||
for k, v in r.items()
|
||||
]
|
||||
except Exception as e:
|
||||
logger.error(f"Unable to get installed software on {agent.salt_id}: {e}")
|
||||
return
|
||||
)
|
||||
|
||||
if not InstalledSoftware.objects.filter(agent=agent).exists():
|
||||
InstalledSoftware(agent=agent, software=software).save()
|
||||
InstalledSoftware(agent=agent, software=sw).save()
|
||||
else:
|
||||
s = agent.installedsoftware_set.get()
|
||||
s.software = software
|
||||
s = agent.installedsoftware_set.first()
|
||||
s.software = sw
|
||||
s.save(update_fields=["software"])
|
||||
|
||||
return "ok"
|
||||
|
||||
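The nats softwarelist path keeps the existing trick of stripping non-printable characters out of vendor-supplied strings before saving them; isolated, the sanitization looks like this:

```python
# Same sanitization as used in get_installed_software above, on its own.
import string

printable = set(string.printable)

def clean(value):
    # drop anything outside string.printable (some installers embed control characters)
    return "".join(filter(lambda x: x in printable, value))

print(clean("7-Zip\x00 19.00"))  # -> "7-Zip 19.00"
```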
@@ -62,72 +62,6 @@ class TestSoftwareViews(TacticalTestCase):
|
||||
|
||||
self.check_not_authenticated("get", url)
|
||||
|
||||
@patch("agents.models.Agent.salt_api_cmd")
|
||||
def test_chocos_refresh(self, salt_api_cmd):
|
||||
|
||||
salt_return = {"git": "2.3.4", "docker": "1.0.2"}
|
||||
|
||||
# test a call where agent doesn't exist
|
||||
resp = self.client.get("/software/refresh/500/", format="json")
|
||||
self.assertEqual(resp.status_code, 404)
|
||||
|
||||
agent = baker.make_recipe("agents.agent")
|
||||
url = f"/software/refresh/{agent.pk}/"
|
||||
|
||||
# test failed attempt
|
||||
salt_api_cmd.return_value = "timeout"
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=20,
|
||||
func="pkg.list_pkgs",
|
||||
kwargs={"include_components": False, "include_updates": False},
|
||||
)
|
||||
salt_api_cmd.reset_mock()
|
||||
|
||||
salt_api_cmd.return_value = "error"
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 400)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=20,
|
||||
func="pkg.list_pkgs",
|
||||
kwargs={"include_components": False, "include_updates": False},
|
||||
)
|
||||
salt_api_cmd.reset_mock()
|
||||
|
||||
# test success and created new software object
|
||||
salt_api_cmd.return_value = salt_return
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=20,
|
||||
func="pkg.list_pkgs",
|
||||
kwargs={"include_components": False, "include_updates": False},
|
||||
)
|
||||
self.assertTrue(InstalledSoftware.objects.filter(agent=agent).exists())
|
||||
salt_api_cmd.reset_mock()
|
||||
|
||||
# test success and updates software object
|
||||
salt_api_cmd.return_value = salt_return
|
||||
resp = self.client.get(url, format="json")
|
||||
self.assertEqual(resp.status_code, 200)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=20,
|
||||
func="pkg.list_pkgs",
|
||||
kwargs={"include_components": False, "include_updates": False},
|
||||
)
|
||||
software = agent.installedsoftware_set.get()
|
||||
|
||||
expected = [
|
||||
{"name": "git", "version": "2.3.4"},
|
||||
{"name": "docker", "version": "1.0.2"},
|
||||
]
|
||||
|
||||
self.assertTrue(InstalledSoftware.objects.filter(agent=agent).exists())
|
||||
self.assertEquals(software.software, expected)
|
||||
|
||||
self.check_not_authenticated("get", url)
|
||||
|
||||
|
||||
class TestSoftwareTasks(TacticalTestCase):
|
||||
@patch("agents.models.Agent.salt_api_cmd")
|
||||
@@ -186,43 +120,57 @@ class TestSoftwareTasks(TacticalTestCase):
|
||||
salt_api_cmd.assert_any_call(timeout=200, func="chocolatey.list")
|
||||
self.assertEquals(salt_api_cmd.call_count, 2)
|
||||
|
||||
@patch("agents.models.Agent.salt_api_cmd")
|
||||
def test_get_installed_software(self, salt_api_cmd):
|
||||
@patch("agents.models.Agent.nats_cmd")
|
||||
def test_get_installed_software(self, nats_cmd):
|
||||
from .tasks import get_installed_software
|
||||
|
||||
agent = baker.make_recipe("agents.agent")
|
||||
|
||||
salt_return = {"git": "2.3.4", "docker": "1.0.2"}
|
||||
|
||||
# test failed attempt
|
||||
salt_api_cmd.return_value = "timeout"
|
||||
ret = get_installed_software(agent.pk)
|
||||
self.assertFalse(ret)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=30,
|
||||
func="pkg.list_pkgs",
|
||||
kwargs={"include_components": False, "include_updates": False},
|
||||
)
|
||||
salt_api_cmd.reset_mock()
|
||||
|
||||
# test successful attempt
|
||||
salt_api_cmd.return_value = salt_return
|
||||
ret = get_installed_software(agent.pk)
|
||||
self.assertTrue(ret)
|
||||
salt_api_cmd.assert_called_with(
|
||||
timeout=30,
|
||||
func="pkg.list_pkgs",
|
||||
kwargs={"include_components": False, "include_updates": False},
|
||||
)
|
||||
software = agent.installedsoftware_set.get()
|
||||
|
||||
expected = [
|
||||
{"name": "git", "version": "2.3.4"},
|
||||
{"name": "docker", "version": "1.0.2"},
|
||||
nats_return = [
|
||||
{
|
||||
"name": "Mozilla Maintenance Service",
|
||||
"size": "336.9 kB",
|
||||
"source": "",
|
||||
"version": "73.0.1",
|
||||
"location": "",
|
||||
"publisher": "Mozilla",
|
||||
"uninstall": '"C:\\Program Files (x86)\\Mozilla Maintenance Service\\uninstall.exe"',
|
||||
"install_date": "0001-01-01 00:00:00 +0000 UTC",
|
||||
},
|
||||
{
|
||||
"name": "OpenVPN 2.4.9-I601-Win10 ",
|
||||
"size": "8.7 MB",
|
||||
"source": "",
|
||||
"version": "2.4.9-I601-Win10",
|
||||
"location": "C:\\Program Files\\OpenVPN\\",
|
||||
"publisher": "OpenVPN Technologies, Inc.",
|
||||
"uninstall": "C:\\Program Files\\OpenVPN\\Uninstall.exe",
|
||||
"install_date": "0001-01-01 00:00:00 +0000 UTC",
|
||||
},
|
||||
{
|
||||
"name": "Microsoft Office Professional Plus 2019 - en-us",
|
||||
"size": "0 B",
|
||||
"source": "",
|
||||
"version": "16.0.10368.20035",
|
||||
"location": "C:\\Program Files\\Microsoft Office",
|
||||
"publisher": "Microsoft Corporation",
|
||||
"uninstall": '"C:\\Program Files\\Common Files\\Microsoft Shared\\ClickToRun\\OfficeClickToRun.exe" scenario=install scenariosubtype=ARP sourcetype=None productstoremove=ProPlus2019Volume.16_en-us_x-none culture=en-us version.16=16.0',
|
||||
"install_date": "0001-01-01 00:00:00 +0000 UTC",
|
||||
},
|
||||
]
|
||||
|
||||
self.assertTrue(InstalledSoftware.objects.filter(agent=agent).exists())
|
||||
self.assertEquals(software.software, expected)
|
||||
# test failed attempt
|
||||
nats_cmd.return_value = "timeout"
|
||||
ret = get_installed_software(agent.pk)
|
||||
self.assertFalse(ret)
|
||||
nats_cmd.assert_called_with({"func": "softwarelist"}, timeout=20)
|
||||
nats_cmd.reset_mock()
|
||||
|
||||
# test successful attempt
|
||||
nats_cmd.return_value = nats_return
|
||||
ret = get_installed_software(agent.pk)
|
||||
self.assertTrue(ret)
|
||||
nats_cmd.assert_called_with({"func": "softwarelist"}, timeout=20)
|
||||
|
||||
@patch("agents.models.Agent.salt_api_cmd")
|
||||
@patch("software.tasks.get_installed_software.delay")
|
||||
|
||||
@@ -1,3 +1,4 @@
import asyncio
import string

from django.shortcuts import get_object_or_404
@@ -41,35 +42,34 @@ def get_installed(request, pk):
@api_view()
def refresh_installed(request, pk):
    agent = get_object_or_404(Agent, pk=pk)
    r = agent.salt_api_cmd(
        timeout=20,
        func="pkg.list_pkgs",
        kwargs={"include_components": False, "include_updates": False},
    )
    if not agent.has_nats:
        return notify_error("Requires agent version 1.1.0 or greater")

    if r == "timeout":
    r = asyncio.run(agent.nats_cmd({"func": "softwarelist"}, timeout=15))
    if r == "timeout" or r == "natsdown":
        return notify_error("Unable to contact the agent")
    elif r == "error":
        return notify_error("Something went wrong")

    printable = set(string.printable)

    try:
        software = [
    sw = []
    for s in r:
        sw.append(
            {
                "name": "".join(filter(lambda x: x in printable, k)),
                "version": "".join(filter(lambda x: x in printable, v)),
                "name": "".join(filter(lambda x: x in printable, s["name"])),
                "version": "".join(filter(lambda x: x in printable, s["version"])),
                "publisher": "".join(filter(lambda x: x in printable, s["publisher"])),
                "install_date": s["install_date"],
                "size": s["size"],
                "source": s["source"],
                "location": s["location"],
                "uninstall": s["uninstall"],
            }
            for k, v in r.items()
        ]
    except Exception:
        return notify_error("Something went wrong")
        )

    if not InstalledSoftware.objects.filter(agent=agent).exists():
        InstalledSoftware(agent=agent, software=software).save()
        InstalledSoftware(agent=agent, software=sw).save()
    else:
        s = agent.installedsoftware_set.get()
        s.software = software
        s = agent.installedsoftware_set.first()
        s.software = sw
        s.save(update_fields=["software"])

    return Response("ok")
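The test changes earlier in this diff drop the old salt-based refresh test without adding a NATS-based replacement for this view. A hedged sketch of what such a test could look like, reusing the `nats_cmd` mocking pattern from `test_get_installed_software`, is below; the import path for `TacticalTestCase`, the `version` kwarg used to make `agent.has_nats` truthy, and the authenticated `self.client` are all assumptions.

```python
# Hedged sketch only; class/method names, the version kwarg and the
# authenticated client provided by TacticalTestCase are assumptions.
from unittest.mock import patch

from model_bakery import baker

from tacticalrmm.test import TacticalTestCase  # import path assumed


class TestRefreshInstalledSketch(TacticalTestCase):  # hypothetical test case
    @patch("agents.models.Agent.nats_cmd")
    def test_refresh_installed(self, nats_cmd):
        agent = baker.make_recipe("agents.agent", version="1.1.0")
        url = f"/software/refresh/{agent.pk}/"

        # agent unreachable -> 400
        nats_cmd.return_value = "timeout"
        resp = self.client.get(url, format="json")
        self.assertEqual(resp.status_code, 400)
        nats_cmd.assert_called_with({"func": "softwarelist"}, timeout=15)
        nats_cmd.reset_mock()

        # an empty software list still succeeds -> 200 "ok"
        nats_cmd.return_value = []
        resp = self.client.get(url, format="json")
        self.assertEqual(resp.status_code, 200)
        nats_cmd.assert_called_with({"func": "softwarelist"}, timeout=15)
```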
@@ -27,15 +27,15 @@ app.conf.beat_schedule = {
    },
    "auto-approve-win-updates": {
        "task": "winupdate.tasks.auto_approve_updates_task",
        "schedule": crontab(minute=0, hour="*/8"),
        "schedule": crontab(minute=2, hour="*/8"),
    },
    "install-scheduled-win-updates": {
        "task": "winupdate.tasks.check_agent_update_schedule_task",
        "schedule": crontab(minute=0, hour="*"),
        "schedule": crontab(minute=5, hour="*"),
    },
    "sync-modules": {
        "task": "agents.tasks.batch_sync_modules_task",
        "schedule": crontab(minute=40, hour="*/4"),
        "schedule": crontab(minute=25, hour="*/4"),
    },
    "sys-info": {
        "task": "agents.tasks.batch_sysinfo_task",
@@ -43,11 +43,11 @@ app.conf.beat_schedule = {
    },
    "update-salt": {
        "task": "agents.tasks.update_salt_minion_task",
        "schedule": crontab(minute=30, hour="*/6"),
        "schedule": crontab(minute=20, hour="*/6"),
    },
    "agent-auto-update": {
        "task": "agents.tasks.auto_self_agent_update_task",
        "schedule": crontab(minute=50, hour="*/3"),
        "schedule": crontab(minute=35, hour="*"),
    },
}

@@ -37,7 +37,6 @@ if not DEBUG:
        )
    })


SALT_USERNAME = "changeme"
SALT_PASSWORD = "changeme"
MESH_USERNAME = "changeme"
@@ -1,8 +1,13 @@
import os
from pathlib import Path
from datetime import timedelta

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

SCRIPTS_DIR = "/srv/salt/scripts"

DOCKER_BUILD = False

LOG_DIR = os.path.join(BASE_DIR, "tacticalrmm/private/log")

EXE_DIR = os.path.join(BASE_DIR, "tacticalrmm/private/exe")
@@ -10,23 +15,25 @@ EXE_DIR = os.path.join(BASE_DIR, "tacticalrmm/private/exe")
AUTH_USER_MODEL = "accounts.User"

# latest release
TRMM_VERSION = "0.1.5"
TRMM_VERSION = "0.2.0"

# bump this version everytime vue code is changed
# to alert user they need to manually refresh their browser
APP_VER = "0.0.87"
APP_VER = "0.0.91"

# https://github.com/wh1te909/salt
LATEST_SALT_VER = "1.1.0"

# https://github.com/wh1te909/rmmagent
LATEST_AGENT_VER = "1.0.1"
LATEST_AGENT_VER = "1.1.0"

MESH_VER = "0.6.62"
MESH_VER = "0.6.84"

SALT_MASTER_VER = "3002.2"

# for the update script, bump when need to recreate venv or npm install
PIP_VER = "2"
NPM_VER = "1"
PIP_VER = "3"
NPM_VER = "2"

DL_64 = f"https://github.com/wh1te909/rmmagent/releases/download/v{LATEST_AGENT_VER}/winagent-v{LATEST_AGENT_VER}.exe"
DL_32 = f"https://github.com/wh1te909/rmmagent/releases/download/v{LATEST_AGENT_VER}/winagent-v{LATEST_AGENT_VER}-x86.exe"
@@ -116,15 +123,9 @@ AUTH_PASSWORD_VALIDATORS = [
    {
        "NAME": "django.contrib.auth.password_validation.UserAttributeSimilarityValidator",
    },
    {
        "NAME": "django.contrib.auth.password_validation.MinimumLengthValidator",
    },
    {
        "NAME": "django.contrib.auth.password_validation.CommonPasswordValidator",
    },
    {
        "NAME": "django.contrib.auth.password_validation.NumericPasswordValidator",
    },
    {"NAME": "django.contrib.auth.password_validation.MinimumLengthValidator",},
    {"NAME": "django.contrib.auth.password_validation.CommonPasswordValidator",},
    {"NAME": "django.contrib.auth.password_validation.NumericPasswordValidator",},
]

@@ -173,6 +174,7 @@ if "TRAVIS" in os.environ:

    ADMIN_URL = "abc123456/"

    SCRIPTS_DIR = os.path.join(Path(BASE_DIR).parents[1], "scripts")
    SALT_USERNAME = "travis"
    SALT_PASSWORD = "travis"
    MESH_USERNAME = "travis"
@@ -205,6 +207,7 @@ if "AZPIPELINE" in os.environ:

    ADMIN_URL = "abc123456/"

    SCRIPTS_DIR = os.path.join(Path(BASE_DIR).parents[1], "scripts")
    SALT_USERNAME = "pipeline"
    SALT_PASSWORD = "pipeline"
    MESH_USERNAME = "pipeline"
@@ -1,4 +1,44 @@
import json
import os
import subprocess
import tldextract

from django.conf import settings
from rest_framework import status
from rest_framework.response import Response

from agents.models import Agent

notify_error = lambda msg: Response(msg, status=status.HTTP_400_BAD_REQUEST)


def reload_nats():
    users = [{"user": "tacticalrmm", "password": settings.SECRET_KEY}]
    agents = Agent.objects.prefetch_related("user").only("pk", "agent_id")
    for agent in agents:
        users.append({"user": agent.agent_id, "password": agent.user.auth_token.key})

    if not settings.DOCKER_BUILD:
        tld = tldextract.extract(settings.ALLOWED_HOSTS[0])
        domain = tld.domain + "." + tld.suffix
        cert_path = f"/etc/letsencrypt/live/{domain}"
    else:
        cert_path = "/opt/tactical/certs"

    config = {
        "tls": {
            "cert_file": f"{cert_path}/fullchain.pem",
            "key_file": f"{cert_path}/privkey.pem",
        },
        "authorization": {"users": users},
        "max_payload": 2048576005,
    }

    conf = os.path.join(settings.BASE_DIR, "nats-rmm.conf")
    with open(conf, "w") as f:
        json.dump(config, f)

    if not settings.DOCKER_BUILD:
        subprocess.run(
            ["/usr/local/bin/nats-server", "-signal", "reload"], capture_output=True
        )
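The docker init entrypoint later in this changeset runs `python manage.py reload_nats`, so the function above is presumably also exposed as a management command. A minimal sketch of such a wrapper (the import path and help text are assumptions):

```python
# Hedged sketch of a management command wrapping reload_nats(); the import
# path of the function above is assumed, not confirmed by this diff.
from django.core.management.base import BaseCommand

from core.utils import reload_nats  # assumed location of the function above


class Command(BaseCommand):
    help = "Regenerate nats-rmm.conf from the current agent list and reload nats-server"

    def handle(self, *args, **kwargs):
        reload_nats()
        self.stdout.write("nats configuration reloaded")
```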
18
backup.sh
Normal file → Executable file
@@ -1,7 +1,7 @@
#!/bin/bash

SCRIPT_VERSION="2"
SCRIPT_URL='https://raw.githubusercontent.com/wh1te909/tacticalrmm/develop/backup.sh'
SCRIPT_VERSION="3"
SCRIPT_URL='https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/backup.sh'

GREEN='\033[0;32m'
YELLOW='\033[1;33m'
@@ -31,11 +31,25 @@ POSTGRES_PW="hunter2"

#####################################################

if [[ "$POSTGRES_USER" == "changeme" || "$POSTGRES_PW" == "hunter2" ]]; then
    printf >&2 "${RED}You must change the postgres username/password at the top of this file.${NC}\n"
    printf >&2 "${RED}Check the github readme for where to find them.${NC}\n"
    exit 1
fi

if [ ! -d /rmmbackups ]; then
    sudo mkdir /rmmbackups
    sudo chown ${USER}:${USER} /rmmbackups
fi

if [ -d /meshcentral/meshcentral-backup ]; then
    rm -f /meshcentral/meshcentral-backup/*
fi

if [ -d /meshcentral/meshcentral-coredumps ]; then
    rm -f /meshcentral/meshcentral-coredumps/*
fi

dt_now=$(date '+%Y_%m_%d__%H_%M_%S')
tmp_dir=$(mktemp -d -t tacticalrmm-XXXXXXXXXXXXXXXXXXXXX)
sysd="/etc/systemd/system"
@@ -1,24 +1,21 @@
MESH_HOST=mesh.example.com
MESH_USER=mesh
MESH_PASS=meshpass
EMAIL_USER=admin@example.com
IMAGE_REPO=tacticalrmm/
VERSION=latest

# tactical credentials (Used to login to dashboard)
TRMM_USER=tactical
TRMM_PASS=tactical

# dns settings
APP_HOST=app.example.com
API_HOST=api.example.com
MESH_HOST=mesh.example.com

# mesh settings
MESH_USER=meshcentral
MESH_PASS=meshcentralpass
MONGODB_USER=mongouser
MONGODB_PASSWORD=mongopass

# database settings
POSTGRES_USER=postgres
POSTGRES_PASS=pass
POSTGRES_HOST=db

APP_HOST=app.example.com
API_HOST=api.example.com

REDIS_HOST=redis

SALT_HOST=salt
SALT_USER=saltapi
SALT_PASS=password

ADMIN_URL=admin
DJANGO_SEKRET=secret12341234123412341234
DJANGO_DEBUG=False
POSTGRES_PASS=postgrespass
@@ -1,64 +0,0 @@
|
||||
user nginx;
|
||||
worker_processes 1;
|
||||
error_log /var/log/nginx/error.log warn;
|
||||
pid /var/run/nginx.pid;
|
||||
|
||||
events {
|
||||
worker_connections 1024;
|
||||
}
|
||||
|
||||
http {
|
||||
include /etc/nginx/mime.types;
|
||||
default_type application/octet-stream;
|
||||
log_format main '$remote_addr - $remote_user [$time_local] "$request" '
|
||||
'$status $body_bytes_sent "$http_referer" '
|
||||
'"$http_user_agent" "$http_x_forwarded_for"';
|
||||
access_log /var/log/nginx/access.log main;
|
||||
sendfile on;
|
||||
keepalive_timeout 65;
|
||||
|
||||
server_tokens off;
|
||||
|
||||
upstream tacticalrmm {
|
||||
server unix:///app/tacticalrmm.sock;
|
||||
}
|
||||
|
||||
server {
|
||||
listen 80;
|
||||
#server_name ${API_HOST};
|
||||
client_max_body_size 300M;
|
||||
access_log /var/log/nginx/api-access.log;
|
||||
error_log /var/log/nginx/api-error.log;
|
||||
|
||||
location /static/ {
|
||||
root /app;
|
||||
}
|
||||
|
||||
location /private/ {
|
||||
internal;
|
||||
add_header "Access-Control-Allow-Origin" "https://${APP_HOST}";
|
||||
alias /app/tacticalrmm/private/;
|
||||
}
|
||||
|
||||
location /saltscripts/ {
|
||||
internal;
|
||||
add_header "Access-Control-Allow-Origin" "https://${APP_HOST}";
|
||||
alias /srv/salt/scripts/userdefined/;
|
||||
}
|
||||
|
||||
location /builtin/ {
|
||||
internal;
|
||||
add_header "Access-Control-Allow-Origin" "https://${APP_HOST}";
|
||||
alias /srv/salt/scripts/;
|
||||
}
|
||||
|
||||
location / {
|
||||
uwsgi_pass tacticalrmm;
|
||||
include /etc/nginx/uwsgi_params;
|
||||
uwsgi_read_timeout 9999s;
|
||||
uwsgi_ignore_client_abort on;
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
daemon off;
|
||||
@@ -1,45 +0,0 @@
|
||||
FROM tiangolo/uwsgi-nginx:python3.8
|
||||
|
||||
WORKDIR /app
|
||||
|
||||
ARG DJANGO_SEKRET
|
||||
ARG DJANGO_DEBUG
|
||||
ARG POSTGRES_USER
|
||||
ARG POSTGRES_PASS
|
||||
ARG POSTGRES_HOST
|
||||
ARG SALT_HOST
|
||||
ARG SALT_USER
|
||||
ARG SALT_PASS
|
||||
ARG REDIS_HOST
|
||||
ARG MESH_USER
|
||||
ARG MESH_HOST
|
||||
ARG MESH_TOKEN_KEY
|
||||
ARG APP_HOST
|
||||
ARG API_HOST
|
||||
ARG ADMIN_URL
|
||||
|
||||
EXPOSE 80
|
||||
|
||||
RUN apt-get update && apt-get install -y gettext-base wget
|
||||
COPY ./api/tacticalrmm/requirements.txt .
|
||||
RUN pip install --upgrade pip
|
||||
RUN pip install --no-cache-dir setuptools==49.6.0 wheel==0.35.1
|
||||
RUN pip install --no-cache-dir -r requirements.txt
|
||||
RUN wget --no-check-certificate https://golang.org/dl/go1.15.linux-amd64.tar.gz -P /tmp
|
||||
COPY ./api/tacticalrmm/ .
|
||||
COPY ./docker/api/prestart.sh .
|
||||
COPY ./docker/api/uwsgi.ini .
|
||||
COPY ./docker/api/api.conf /app/api.conf.tmp
|
||||
RUN envsubst '\$APP_HOST, \$API_HOST' < /app/api.conf.tmp > /app/nginx.conf && \
|
||||
rm /app/api.conf.tmp
|
||||
COPY ./docker/api/local_settings.py.keep ./tacticalrmm/local_settings.py.tmp
|
||||
RUN envsubst < /app/tacticalrmm/local_settings.py.tmp > /app/tacticalrmm/local_settings.py && rm /app/tacticalrmm/local_settings.py.tmp
|
||||
|
||||
RUN tar -xzf /tmp/go1.15.linux-amd64.tar.gz -C /tmp && \
|
||||
mkdir /usr/local/rmmgo && \
|
||||
mv /tmp/go /usr/local/rmmgo/ && \
|
||||
rm -rf /tmp/go
|
||||
|
||||
RUN /usr/local/rmmgo/go/bin/go get github.com/josephspurrier/goversioninfo/cmd/goversioninfo && \
|
||||
cp ./api/tacticalrmm/core/goinstaller/bin/goversioninfo /usr/local/bin/ && \
|
||||
chmod +x /usr/local/bin/goversioninfo
|
||||
@@ -1,47 +0,0 @@
|
||||
SECRET_KEY = '${DJANGO_SEKRET}'
|
||||
|
||||
ALLOWED_HOSTS = ['${API_HOST}']
|
||||
|
||||
ADMIN_URL = "${ADMIN_URL}"
|
||||
|
||||
CORS_ORIGIN_WHITELIST = ["https://${APP_HOST}",]
|
||||
|
||||
DEBUG = ${DJANGO_DEBUG}
|
||||
|
||||
DATABASES = {
|
||||
'default': {
|
||||
'ENGINE': 'django.db.backends.postgresql',
|
||||
'NAME': 'tacticalrmm',
|
||||
'USER': '${POSTGRES_USER}',
|
||||
'PASSWORD': '${POSTGRES_PASS}',
|
||||
'HOST': '${POSTGRES_HOST}',
|
||||
'PORT': '5432',
|
||||
}
|
||||
}
|
||||
|
||||
REST_FRAMEWORK = {
|
||||
'DATETIME_FORMAT': "%b-%d-%Y - %H:%M",
|
||||
|
||||
'DEFAULT_PERMISSION_CLASSES': (
|
||||
'rest_framework.permissions.IsAuthenticated',
|
||||
),
|
||||
'DEFAULT_AUTHENTICATION_CLASSES': (
|
||||
'knox.auth.TokenAuthentication',
|
||||
),
|
||||
}
|
||||
|
||||
if not DEBUG:
|
||||
REST_FRAMEWORK.update({
|
||||
'DEFAULT_RENDERER_CLASSES': (
|
||||
'rest_framework.renderers.JSONRenderer',
|
||||
)
|
||||
})
|
||||
|
||||
SALT_USERNAME = "${SALT_USER}"
|
||||
SALT_PASSWORD = "${SALT_PASS}"
|
||||
MESH_USERNAME = "${MESH_USER}"
|
||||
MESH_SITE = "https://${MESH_HOST}"
|
||||
MESH_WS_URL="ws://meshcentral:443"
|
||||
MESH_TOKEN_KEY = "${MESH_TOKEN_KEY}"
|
||||
REDIS_HOST = "${REDIS_HOST}"
|
||||
SALT_HOST = "${SALT_HOST}"
|
||||
@@ -1,10 +0,0 @@
|
||||
#! /usr/bin/env bash
|
||||
|
||||
sleep 10
|
||||
python manage.py migrate --no-input
|
||||
python manage.py collectstatic --no-input
|
||||
python manage.py initial_db_setup
|
||||
python manage.py initial_mesh_setup
|
||||
python manage.py load_chocos
|
||||
python manage.py fix_salt_key
|
||||
python manage.py load_community_scripts
|
||||
@@ -1,14 +0,0 @@
|
||||
[uwsgi]
|
||||
|
||||
logto = /app/tacticalrmm/private/log/uwsgi.log
|
||||
chdir = /app
|
||||
wsgi-file = tacticalrmm/wsgi.py
|
||||
master = true
|
||||
processes = 4
|
||||
threads = 2
|
||||
socket = /app/tacticalrmm.sock
|
||||
# clear environment on exit
|
||||
vacuum = true
|
||||
die-on-term = true
|
||||
max-requests = 500
|
||||
max-requests-delta = 1000
|
||||
@@ -1,2 +0,0 @@
|
||||
PROD_URL = "https://${API_HOST}"
|
||||
DEV_URL = "https://${API_HOST}"
|
||||
@@ -1,16 +0,0 @@
|
||||
|
||||
server {
|
||||
listen 80;
|
||||
#server_name ${APP_HOST};
|
||||
charset utf-8;
|
||||
|
||||
location / {
|
||||
root /usr/share/nginx/html;
|
||||
try_files $uri $uri/ /index.html;
|
||||
add_header Cache-Control "no-store, no-cache, must-revalidate";
|
||||
add_header Pragma "no-cache";
|
||||
}
|
||||
|
||||
error_log /var/log/nginx/app-error.log;
|
||||
access_log /var/log/nginx/app-access.log;
|
||||
}
|
||||
@@ -1,19 +0,0 @@
|
||||
FROM node:12-alpine AS builder
|
||||
ARG APP_HOST
|
||||
ARG API_HOST
|
||||
EXPOSE 80
|
||||
WORKDIR /home/node
|
||||
RUN apk add gettext
|
||||
COPY ./web/package.json .
|
||||
RUN npm install
|
||||
COPY ./docker/app/.env.keep /home/.env.tmp
|
||||
RUN envsubst '\$API_HOST' < /home/.env.tmp > /home/node/.env && rm /home/.env.tmp
|
||||
COPY ./docker/app/app.conf /home/node/app.conf.tmp
|
||||
RUN envsubst '\$APP_HOST' < /home/node/app.conf.tmp > /home/node/app.conf
|
||||
COPY ./web .
|
||||
RUN npm run build
|
||||
|
||||
FROM nginx:alpine
|
||||
WORKDIR /usr/share/nginx/html
|
||||
COPY --from=builder /home/node/dist .
|
||||
COPY --from=builder /home/node/app.conf /etc/nginx/conf.d/default.conf
|
||||
29
docker/containers/tactical-frontend/dockerfile
Normal file
@@ -0,0 +1,29 @@
|
||||
FROM node:12-alpine AS builder
|
||||
|
||||
WORKDIR /home/node/app
|
||||
|
||||
COPY ./web/package.json .
|
||||
RUN npm install
|
||||
|
||||
COPY ./web .
|
||||
|
||||
# copy env file to set DOCKER_BUILD to true
|
||||
RUN echo "DOCKER_BUILD=1" > .env
|
||||
|
||||
# modify index.html template to allow injection of js variables at runtime
|
||||
RUN sed -i '/<\/head>/i <script src="\/env-config.js"><\/script>' src/index.template.html
|
||||
RUN npm run build
|
||||
|
||||
FROM nginx:stable-alpine
|
||||
|
||||
ENV PUBLIC_DIR /usr/share/nginx/html
|
||||
|
||||
RUN apk add --no-cache bash
|
||||
SHELL ["/bin/bash", "-c"]
|
||||
|
||||
COPY --from=builder /home/node/app/dist/ ${PUBLIC_DIR}
|
||||
|
||||
COPY docker/containers/tactical-frontend/entrypoint.sh /docker-entrypoint.d/
|
||||
RUN chmod +x /docker-entrypoint.d/entrypoint.sh
|
||||
|
||||
EXPOSE 80
|
||||
31
docker/containers/tactical-frontend/entrypoint.sh
Normal file
@@ -0,0 +1,31 @@
|
||||
#!/usr/bin/env bash
|
||||
#
|
||||
# https://www.freecodecamp.org/news/how-to-implement-runtime-environment-variables-with-create-react-app-docker-and-nginx-7f9d42a91d70/
|
||||
#
|
||||
|
||||
# Recreate js config file on start
|
||||
rm -rf ${PUBLIC_DIR}/env-config.js
|
||||
touch ${PUBLIC_DIR}/env-config.js
|
||||
|
||||
# Add runtime base url assignment
|
||||
echo "window._env_ = {PROD_URL: \"https://${API_HOST}\"}" >> ${PUBLIC_DIR}/env-config.js
|
||||
|
||||
nginx_config="$(cat << EOF
|
||||
server {
|
||||
listen 80;
|
||||
charset utf-8;
|
||||
|
||||
location / {
|
||||
root /usr/share/nginx/html;
|
||||
try_files \$uri \$uri/ /index.html;
|
||||
add_header Cache-Control "no-store, no-cache, must-revalidate";
|
||||
add_header Pragma "no-cache";
|
||||
}
|
||||
|
||||
error_log /var/log/nginx/app-error.log;
|
||||
access_log /var/log/nginx/app-access.log;
|
||||
}
|
||||
EOF
|
||||
)"
|
||||
|
||||
echo "${nginx_config}" > /etc/nginx/conf.d/default.conf
|
||||
18
docker/containers/tactical-meshcentral/dockerfile
Normal file
@@ -0,0 +1,18 @@
|
||||
FROM node:12-alpine
|
||||
|
||||
WORKDIR /home/node/app
|
||||
|
||||
ENV TACTICAL_DIR /opt/tactical
|
||||
|
||||
RUN apk add --no-cache bash
|
||||
|
||||
SHELL ["/bin/bash", "-c"]
|
||||
|
||||
RUN npm install meshcentral@0.6.62
|
||||
|
||||
COPY docker/containers/tactical-meshcentral/entrypoint.sh /
|
||||
RUN chmod +x /entrypoint.sh
|
||||
|
||||
EXPOSE 80 443
|
||||
|
||||
ENTRYPOINT [ "/entrypoint.sh" ]
|
||||
66
docker/containers/tactical-meshcentral/entrypoint.sh
Normal file
@@ -0,0 +1,66 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
set -e
|
||||
|
||||
: "${MESH_USER:=meshcentral}"
|
||||
: "${MESH_PASS:=meshcentralpass}"
|
||||
: "${MONGODB_USER:=mongouser}"
|
||||
: "${MONGODB_PASSWORD:=mongopass}"
|
||||
: "${MONGODB_HOST:=tactical-mongodb}"
|
||||
: "${MONGODB_PORT:=27017}"
|
||||
: "${NGINX_HOST_IP:=172.20.0.20}"
|
||||
|
||||
mkdir -p /home/node/app/meshcentral-data
|
||||
mkdir -p ${TACTICAL_DIR}/tmp
|
||||
|
||||
mesh_config="$(cat << EOF
|
||||
{
|
||||
"settings": {
|
||||
"mongodb": "mongodb://${MONGODB_USER}:${MONGODB_PASSWORD}@${MONGODB_HOST}:${MONGODB_PORT}",
|
||||
"Cert": "${MESH_HOST}",
|
||||
"TLSOffload": "${NGINX_HOST_IP}",
|
||||
"RedirPort": 80,
|
||||
"WANonly": true,
|
||||
"Minify": 1,
|
||||
"Port": 443,
|
||||
"AllowLoginToken": true,
|
||||
"AllowFraming": true,
|
||||
"_AgentPing": 60,
|
||||
"AgentPong": 300,
|
||||
"AllowHighQualityDesktop": true,
|
||||
"MaxInvalidLogin": {
|
||||
"time": 5,
|
||||
"count": 5,
|
||||
"coolofftime": 30
|
||||
}
|
||||
},
|
||||
"domains": {
|
||||
"": {
|
||||
"Title": "Tactical RMM",
|
||||
"Title2": "TacticalRMM",
|
||||
"NewAccounts": false,
|
||||
"mstsc": true,
|
||||
"GeoLocation": true,
|
||||
"CertUrl": "https://${NGINX_HOST_IP}:443",
|
||||
"httpheaders": {
|
||||
"Strict-Transport-Security": "max-age=360000",
|
||||
"_x-frame-options": "sameorigin",
|
||||
"Content-Security-Policy": "default-src 'none'; script-src 'self' 'unsafe-inline'; connect-src 'self'; img-src 'self' data:; style-src 'self' 'unsafe-inline'; frame-src 'self'; media-src 'self'"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
EOF
|
||||
)"
|
||||
|
||||
echo "${mesh_config}" > /home/node/app/meshcentral-data/config.json
|
||||
|
||||
node node_modules/meshcentral --createaccount ${MESH_USER} --pass ${MESH_PASS} --email example@example.com
|
||||
node node_modules/meshcentral --adminaccount ${MESH_USER}
|
||||
|
||||
if [ ! -f "${TACTICAL_DIR}/tmp/mesh_token" ]; then
|
||||
node node_modules/meshcentral --logintokenkey > ${TACTICAL_DIR}/tmp/mesh_token
|
||||
fi
|
||||
|
||||
# start mesh
|
||||
node node_modules/meshcentral
|
||||
15
docker/containers/tactical-nats/dockerfile
Normal file
@@ -0,0 +1,15 @@
|
||||
FROM nats:2.1-alpine
|
||||
|
||||
ENV TACTICAL_DIR /opt/tactical
|
||||
ENV TACTICAL_READY_FILE ${TACTICAL_DIR}/tmp/tactical.ready
|
||||
|
||||
RUN apk add --no-cache inotify-tools supervisor bash
|
||||
|
||||
SHELL ["/bin/bash", "-c"]
|
||||
|
||||
COPY docker/containers/tactical-nats/entrypoint.sh /
|
||||
RUN chmod +x /entrypoint.sh
|
||||
|
||||
ENTRYPOINT [ "/entrypoint.sh" ]
|
||||
|
||||
EXPOSE 4222
|
||||
37
docker/containers/tactical-nats/entrypoint.sh
Normal file
@@ -0,0 +1,37 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
set -e
|
||||
|
||||
sleep 15
|
||||
until [ -f "${TACTICAL_READY_FILE}" ]; do
|
||||
echo "waiting for init container to finish install or update..."
|
||||
sleep 10
|
||||
done
|
||||
|
||||
mkdir -p /var/log/supervisor
|
||||
mkdir -p /etc/supervisor/conf.d
|
||||
|
||||
supervisor_config="$(cat << EOF
|
||||
[supervisord]
|
||||
nodaemon=true
|
||||
[include]
|
||||
files = /etc/supervisor/conf.d/*.conf
|
||||
|
||||
[program:nats-server]
|
||||
command=nats-server -DVV --config "${TACTICAL_DIR}/api/nats-rmm.conf"
|
||||
stdout_logfile=/dev/fd/1
|
||||
stdout_logfile_maxbytes=0
|
||||
redirect_stderr=true
|
||||
|
||||
[program:config-watcher]
|
||||
command="inotifywait -m -e close_write ${TACTICAL_DIR}/api/nats-rmm.conf"; | while read events; do "nats-server --signal reload"; done;
|
||||
stdout_logfile=/dev/fd/1
|
||||
stdout_logfile_maxbytes=0
|
||||
redirect_stderr=true
|
||||
EOF
|
||||
)"
|
||||
|
||||
echo "${supervisor_config}" > /etc/supervisor/conf.d/supervisor.conf
|
||||
|
||||
# run supervised processes
|
||||
/usr/bin/supervisord -c /etc/supervisor/conf.d/supervisor.conf
|
||||
12
docker/containers/tactical-nginx/dockerfile
Normal file
@@ -0,0 +1,12 @@
|
||||
FROM nginx:stable-alpine
|
||||
|
||||
ENV TACTICAL_DIR /opt/tactical
|
||||
|
||||
RUN apk add --no-cache openssl bash
|
||||
|
||||
SHELL ["/bin/bash", "-c"]
|
||||
|
||||
COPY docker/containers/tactical-nginx/entrypoint.sh /docker-entrypoint.d/
|
||||
RUN chmod +x /docker-entrypoint.d/entrypoint.sh
|
||||
|
||||
EXPOSE 443 80
|
||||
173
docker/containers/tactical-nginx/entrypoint.sh
Normal file
@@ -0,0 +1,173 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
set -e
|
||||
|
||||
CERT_PRIV_PATH=${TACTICAL_DIR}/certs/privkey.pem
|
||||
CERT_PUB_PATH=${TACTICAL_DIR}/certs/fullchain.pem
|
||||
|
||||
mkdir -p "${TACTICAL_DIR}/certs"
|
||||
|
||||
# remove default config
|
||||
rm -f /etc/nginx/conf.d/default.conf
|
||||
|
||||
# check for certificates in env variable
|
||||
if [ ! -z "$CERT_PRIV_KEY" ] && [ ! -z "$CERT_PUB_KEY" ]; then
|
||||
echo "${CERT_PRIV_KEY}" | base64 -d > ${CERT_PRIV_PATH}
|
||||
echo "${CERT_PUB_KEY}" | base64 -d > ${CERT_PUB_PATH}
|
||||
else
|
||||
# generate a self signed cert
|
||||
if [ ! -f "${CERT_PRIV_PATH}" ] || [ ! -f "${CERT_PUB_PATH}" ]; then
|
||||
rootdomain=$(echo ${API_HOST} | cut -d "." -f2- )
|
||||
openssl req -newkey rsa:4096 -x509 -sha256 -days 365 -nodes -out ${CERT_PUB_PATH} -keyout ${CERT_PRIV_PATH} -subj "/C=US/ST=Some-State/L=city/O=Internet Widgits Pty Ltd/CN=*.${rootdomain}"
|
||||
fi
|
||||
fi
|
||||
|
||||
nginx_config="$(cat << EOF
|
||||
# backend config
|
||||
server {
|
||||
resolver 127.0.0.11 valid=30s;
|
||||
|
||||
server_name ${API_HOST};
|
||||
|
||||
location / {
|
||||
#Using variable to disable start checks
|
||||
set \$api http://tactical-backend;
|
||||
|
||||
proxy_pass \$api;
|
||||
proxy_http_version 1.1;
|
||||
proxy_cache_bypass \$http_upgrade;
|
||||
|
||||
proxy_set_header Upgrade \$http_upgrade;
|
||||
proxy_set_header Connection "upgrade";
|
||||
proxy_set_header Host \$host;
|
||||
proxy_set_header X-Real-IP \$remote_addr;
|
||||
proxy_set_header X-Forwarded-For \$proxy_add_x_forwarded_for;
|
||||
proxy_set_header X-Forwarded-Proto \$scheme;
|
||||
proxy_set_header X-Forwarded-Host \$host;
|
||||
proxy_set_header X-Forwarded-Port \$server_port;
|
||||
}
|
||||
|
||||
location /static/ {
|
||||
root ${TACTICAL_DIR}/api;
|
||||
}
|
||||
|
||||
location /private/ {
|
||||
internal;
|
||||
add_header "Access-Control-Allow-Origin" "https://${APP_HOST}";
|
||||
alias ${TACTICAL_DIR}/api/tacticalrmm/private/;
|
||||
}
|
||||
|
||||
location /saltscripts/ {
|
||||
internal;
|
||||
add_header "Access-Control-Allow-Origin" "https://${APP_HOST}";
|
||||
alias ${TACTICAL_DIR}/scripts/userdefined/;
|
||||
}
|
||||
|
||||
location /builtin/ {
|
||||
internal;
|
||||
add_header "Access-Control-Allow-Origin" "https://${APP_HOST}";
|
||||
alias ${TACTICAL_DIR}/scripts/;
|
||||
}
|
||||
|
||||
error_log /var/log/nginx/api-error.log;
|
||||
access_log /var/log/nginx/api-access.log;
|
||||
|
||||
client_max_body_size 300M;
|
||||
|
||||
listen 443 ssl;
|
||||
ssl_certificate ${CERT_PUB_PATH};
|
||||
ssl_certificate_key ${CERT_PRIV_PATH};
|
||||
ssl_ciphers 'ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256';
|
||||
|
||||
}
|
||||
|
||||
server {
|
||||
listen 80;
|
||||
server_name ${API_HOST};
|
||||
return 301 https://\$server_name\$request_uri;
|
||||
}
|
||||
|
||||
# frontend config
|
||||
server {
|
||||
resolver 127.0.0.11 valid=30s;
|
||||
|
||||
server_name ${APP_HOST};
|
||||
|
||||
location / {
|
||||
#Using variable to disable start checks
|
||||
set \$app http://tactical-frontend;
|
||||
|
||||
proxy_pass \$app;
|
||||
proxy_http_version 1.1;
|
||||
proxy_cache_bypass \$http_upgrade;
|
||||
|
||||
proxy_set_header Upgrade \$http_upgrade;
|
||||
proxy_set_header Connection "upgrade";
|
||||
proxy_set_header Host \$host;
|
||||
proxy_set_header X-Real-IP \$remote_addr;
|
||||
proxy_set_header X-Forwarded-For \$proxy_add_x_forwarded_for;
|
||||
proxy_set_header X-Forwarded-Proto \$scheme;
|
||||
proxy_set_header X-Forwarded-Host \$host;
|
||||
proxy_set_header X-Forwarded-Port \$server_port;
|
||||
}
|
||||
|
||||
error_log /var/log/nginx/app-error.log;
|
||||
access_log /var/log/nginx/app-access.log;
|
||||
|
||||
listen 443 ssl;
|
||||
ssl_certificate ${CERT_PUB_PATH};
|
||||
ssl_certificate_key ${CERT_PRIV_PATH};
|
||||
ssl_ciphers 'ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256';
|
||||
|
||||
}
|
||||
|
||||
server {
|
||||
|
||||
listen 80;
|
||||
server_name ${APP_HOST};
|
||||
return 301 https://\$server_name\$request_uri;
|
||||
}
|
||||
|
||||
# meshcentral config
|
||||
server {
|
||||
resolver 127.0.0.11 valid=30s;
|
||||
|
||||
listen 443 ssl;
|
||||
proxy_send_timeout 330s;
|
||||
proxy_read_timeout 330s;
|
||||
server_name ${MESH_HOST};
|
||||
ssl_certificate ${CERT_PUB_PATH};
|
||||
ssl_certificate_key ${CERT_PRIV_PATH};
|
||||
ssl_session_cache shared:WEBSSL:10m;
|
||||
ssl_ciphers HIGH:!aNULL:!MD5;
|
||||
ssl_prefer_server_ciphers on;
|
||||
|
||||
location / {
|
||||
#Using variable to disable start checks
|
||||
set \$meshcentral http://tactical-meshcentral:443;
|
||||
|
||||
proxy_pass \$meshcentral;
|
||||
proxy_http_version 1.1;
|
||||
|
||||
proxy_set_header Upgrade \$http_upgrade;
|
||||
proxy_set_header Connection "upgrade";
|
||||
|
||||
proxy_set_header Host \$host;
|
||||
proxy_set_header X-Real-IP \$remote_addr;
|
||||
proxy_set_header X-Forwarded-Host \$host:\$server_port;
|
||||
proxy_set_header X-Forwarded-For \$proxy_add_x_forwarded_for;
|
||||
proxy_set_header X-Forwarded-Proto \$scheme;
|
||||
}
|
||||
}
|
||||
|
||||
server {
|
||||
resolver 127.0.0.11 valid=30s;
|
||||
|
||||
listen 80;
|
||||
server_name ${MESH_HOST};
|
||||
return 301 https://\$server_name\$request_uri;
|
||||
}
|
||||
EOF
|
||||
)"
|
||||
|
||||
echo "${nginx_config}" > /etc/nginx/conf.d/default.conf
|
||||
21
docker/containers/tactical-salt/dockerfile
Normal file
@@ -0,0 +1,21 @@
|
||||
FROM ubuntu:20.04
|
||||
|
||||
ENV TACTICAL_DIR /opt/tactical
|
||||
ENV SALT_USER saltapi
|
||||
|
||||
RUN apt-get update && \
|
||||
apt-get install -y ca-certificates wget gnupg2 tzdata supervisor && \
|
||||
wget -O - https://repo.saltstack.com/py3/ubuntu/20.04/amd64/latest/SALTSTACK-GPG-KEY.pub | apt-key add - && \
|
||||
echo 'deb http://repo.saltstack.com/py3/ubuntu/20.04/amd64/latest focal main' | tee /etc/apt/sources.list.d/saltstack.list && \
|
||||
apt-get update && \
|
||||
apt-get install -y salt-master salt-api && \
|
||||
mkdir -p /var/log/supervisor && \
|
||||
sed -i 's/msgpack_kwargs = {"raw": six.PY2}/msgpack_kwargs = {"raw": six.PY2, "max_buffer_size": 2147483647}/g' /usr/lib/python3/dist-packages/salt/transport/ipc.py && \
|
||||
adduser --no-create-home --disabled-password --gecos "" ${SALT_USER}
|
||||
|
||||
EXPOSE 8123 4505 4506
|
||||
|
||||
COPY docker/containers/tactical-salt/entrypoint.sh /
|
||||
RUN chmod +x /entrypoint.sh
|
||||
|
||||
ENTRYPOINT [ "/entrypoint.sh" ]
|
||||
57
docker/containers/tactical-salt/entrypoint.sh
Normal file
@@ -0,0 +1,57 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
set -e
|
||||
|
||||
: "${SALT_USER:='saltapi'}"
|
||||
|
||||
# wait for salt password to be generated by tactical-init
|
||||
until [ -f "${TACTICAL_DIR}/tmp/salt_pass" ]; do
|
||||
echo "waiting for salt password to be generated..."
|
||||
sleep 10
|
||||
done
|
||||
|
||||
SALT_PASS=$(cat ${TACTICAL_DIR}/tmp/salt_pass)
|
||||
|
||||
echo "${SALT_USER}:${SALT_PASS}" | chpasswd
|
||||
|
||||
cherrypy_config="$(cat << EOF
|
||||
module_dirs: ['/opt/tactical/_modules']
|
||||
timeout: 20
|
||||
gather_job_timeout: 25
|
||||
max_event_size: 30485760
|
||||
external_auth:
|
||||
pam:
|
||||
${SALT_USER}:
|
||||
- .*
|
||||
- '@runner'
|
||||
- '@wheel'
|
||||
- '@jobs'
|
||||
rest_cherrypy:
|
||||
port: 8123
|
||||
disable_ssl: True
|
||||
max_request_body_size: 30485760
|
||||
EOF
|
||||
)"
|
||||
|
||||
echo "${cherrypy_config}" > /etc/salt/master.d/rmm-salt.conf
|
||||
|
||||
supervisor_config="$(cat << EOF
|
||||
[supervisord]
|
||||
nodaemon=true
|
||||
[include]
|
||||
files = /etc/supervisor/conf.d/*.conf
|
||||
|
||||
[program:salt-master]
|
||||
command=/bin/bash -c "salt-master -l debug"
|
||||
redirect_stderr=true
|
||||
|
||||
[program:salt-api]
|
||||
command=/bin/bash -c "salt-api -l debug"
|
||||
redirect_stderr=true
|
||||
EOF
|
||||
)"
|
||||
|
||||
echo "${supervisor_config}" > /etc/supervisor/conf.d/supervisor.conf
|
||||
|
||||
# run salt and salt master
|
||||
/usr/bin/supervisord
|
||||
69
docker/containers/tactical/dockerfile
Normal file
@@ -0,0 +1,69 @@
|
||||
# creates python virtual env
|
||||
FROM python:3.8-slim AS CREATE_VENV_STAGE
|
||||
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
|
||||
# # set env variables
|
||||
ENV VIRTUAL_ENV /opt/venv
|
||||
ENV TACTICAL_DIR /opt/tactical
|
||||
ENV TACTICAL_TMP_DIR /tmp/tactical
|
||||
RUN python3 -m venv $VIRTUAL_ENV
|
||||
ENV PATH "${VIRTUAL_ENV}/bin:$PATH"
|
||||
|
||||
SHELL ["/bin/bash", "-e", "-o", "pipefail", "-c"]
|
||||
|
||||
COPY api/tacticalrmm/requirements.txt ${TACTICAL_TMP_DIR}/api/requirements.txt
|
||||
|
||||
RUN apt-get update && \
|
||||
apt-get install -y --no-install-recommends gcc libc6-dev && \
|
||||
rm -rf /var/lib/apt/lists/* && \
|
||||
pip install --upgrade pip && \
|
||||
pip install --no-cache-dir setuptools wheel gunicorn && \
|
||||
sed -i '/uWSGI/d' ${TACTICAL_TMP_DIR}/api/requirements.txt && \
|
||||
pip install --no-cache-dir -r ${TACTICAL_TMP_DIR}/api/requirements.txt
|
||||
|
||||
|
||||
# runtime image
|
||||
FROM python:3.8-slim
|
||||
|
||||
# set env variables
|
||||
ENV VIRTUAL_ENV /opt/venv
|
||||
ENV TACTICAL_DIR /opt/tactical
|
||||
ENV TACTICAL_TMP_DIR /tmp/tactical
|
||||
ENV TACTICAL_GO_DIR /usr/local/rmmgo
|
||||
ENV TACTICAL_READY_FILE ${TACTICAL_DIR}/tmp/tactical.ready
|
||||
ENV TACTICAL_USER tactical
|
||||
ENV PATH "${VIRTUAL_ENV}/bin:${TACTICAL_GO_DIR}/go/bin:$PATH"
|
||||
|
||||
# copy files from repo
|
||||
COPY api/tacticalrmm ${TACTICAL_TMP_DIR}/api
|
||||
COPY scripts ${TACTICAL_TMP_DIR}/scripts
|
||||
COPY _modules ${TACTICAL_TMP_DIR}/_modules
|
||||
|
||||
# copy go install from build stage
|
||||
COPY --from=golang:1.15 /usr/local/go ${TACTICAL_GO_DIR}/go
|
||||
COPY --from=CREATE_VENV_STAGE ${VIRTUAL_ENV} ${VIRTUAL_ENV}
|
||||
|
||||
# install deps
|
||||
RUN apt-get update && \
|
||||
apt-get upgrade -y && \
|
||||
apt-get install -y --no-install-recommends git && \
|
||||
rm -rf /var/lib/apt/lists/* && \
|
||||
go get github.com/josephspurrier/goversioninfo/cmd/goversioninfo && \
|
||||
groupadd -g 1000 "${TACTICAL_USER}" && \
|
||||
useradd -M -d "${TACTICAL_DIR}" -s /bin/bash -u 1000 -g 1000 "${TACTICAL_USER}"
|
||||
|
||||
SHELL ["/bin/bash", "-e", "-o", "pipefail", "-c"]
|
||||
|
||||
# overwrite goversioninfo file
|
||||
COPY api/tacticalrmm/core/goinstaller/bin/goversioninfo /usr/local/bin/goversioninfo
|
||||
RUN chmod +x /usr/local/bin/goversioninfo
|
||||
|
||||
# docker init
|
||||
COPY docker/containers/tactical/entrypoint.sh /
|
||||
RUN chmod +x /entrypoint.sh
|
||||
ENTRYPOINT ["/entrypoint.sh"]
|
||||
|
||||
WORKDIR ${TACTICAL_DIR}/api
|
||||
|
||||
EXPOSE 80
|
||||
180
docker/containers/tactical/entrypoint.sh
Normal file
@@ -0,0 +1,180 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
set -e
|
||||
|
||||
: "${TRMM_USER:=tactical}"
|
||||
: "${TRMM_PASS:=tactical}"
|
||||
: "${POSTGRES_HOST:=tactical-postgres}"
|
||||
: "${POSTGRES_PORT:=5432}"
|
||||
: "${POSTGRES_USER:=tactical}"
|
||||
: "${POSTGRES_PASS:=tactical}"
|
||||
: "${POSTGRES_DB:=tacticalrmm}"
|
||||
: "${SALT_HOST:=tactical-salt}"
|
||||
: "${SALT_USER:=saltapi}"
|
||||
: "${MESH_CONTAINER:=tactical-meshcentral}"
|
||||
: "${MESH_USER:=meshcentral}"
|
||||
: "${MESH_PASS:=meshcentralpass}"
|
||||
: "${MESH_HOST:=tactical-meshcentral}"
|
||||
: "${API_HOST:=tactical-backend}"
|
||||
: "${APP_HOST:=tactical-frontend}"
|
||||
: "${REDIS_HOST:=tactical-redis}"
|
||||
|
||||
|
||||
function check_tactical_ready {
|
||||
sleep 15
|
||||
until [ -f "${TACTICAL_READY_FILE}" ]; do
|
||||
echo "waiting for init container to finish install or update..."
|
||||
sleep 10
|
||||
done
|
||||
}
|
||||
|
||||
# tactical-init
|
||||
if [ "$1" = 'tactical-init' ]; then
|
||||
|
||||
mkdir -p ${TACTICAL_DIR}/tmp
|
||||
mkdir -p ${TACTICAL_DIR}/scripts/userdefined
|
||||
|
||||
test -f "${TACTICAL_READY_FILE}" && rm "${TACTICAL_READY_FILE}"
|
||||
|
||||
# copy container data to volume
|
||||
cp -af ${TACTICAL_TMP_DIR}/. ${TACTICAL_DIR}/
|
||||
|
||||
until (echo > /dev/tcp/"${POSTGRES_HOST}"/"${POSTGRES_PORT}") &> /dev/null; do
|
||||
echo "waiting for postgresql container to be ready..."
|
||||
sleep 5
|
||||
done
|
||||
|
||||
until (echo > /dev/tcp/"${MESH_CONTAINER}"/443) &> /dev/null; do
|
||||
echo "waiting for meshcentral container to be ready..."
|
||||
sleep 5
|
||||
done
|
||||
|
||||
# configure django settings
|
||||
MESH_TOKEN=$(cat ${TACTICAL_DIR}/tmp/mesh_token)
|
||||
ADMINURL=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 70 | head -n 1)
|
||||
DJANGO_SEKRET=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 80 | head -n 1)
|
||||
|
||||
# write salt pass to tmp dir
|
||||
if [ ! -f "${TACTICAL__DIR}/tmp/salt_pass" ]; then
|
||||
SALT_PASS=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 20 | head -n 1)
|
||||
echo "${SALT_PASS}" > ${TACTICAL_DIR}/tmp/salt_pass
|
||||
else
|
||||
SALT_PASS=$(cat ${TACTICAL_DIR}/tmp/salt_pass)
|
||||
fi
|
||||
|
||||
localvars="$(cat << EOF
|
||||
SECRET_KEY = '${DJANGO_SEKRET}'
|
||||
|
||||
DEBUG = False
|
||||
|
||||
DOCKER_BUILD = True
|
||||
|
||||
SCRIPTS_DIR = '/opt/tactical/scripts'
|
||||
|
||||
ALLOWED_HOSTS = ['${API_HOST}']
|
||||
|
||||
ADMIN_URL = '${ADMINURL}/'
|
||||
|
||||
CORS_ORIGIN_WHITELIST = [
|
||||
'https://${APP_HOST}'
|
||||
]
|
||||
|
||||
DATABASES = {
|
||||
'default': {
|
||||
'ENGINE': 'django.db.backends.postgresql',
|
||||
'NAME': '${POSTGRES_DB}',
|
||||
'USER': '${POSTGRES_USER}',
|
||||
'PASSWORD': '${POSTGRES_PASS}',
|
||||
'HOST': '${POSTGRES_HOST}',
|
||||
'PORT': '${POSTGRES_PORT}',
|
||||
}
|
||||
}
|
||||
|
||||
REST_FRAMEWORK = {
|
||||
'DATETIME_FORMAT': '%b-%d-%Y - %H:%M',
|
||||
|
||||
'DEFAULT_PERMISSION_CLASSES': (
|
||||
'rest_framework.permissions.IsAuthenticated',
|
||||
),
|
||||
'DEFAULT_AUTHENTICATION_CLASSES': (
|
||||
'knox.auth.TokenAuthentication',
|
||||
),
|
||||
}
|
||||
|
||||
if not DEBUG:
|
||||
REST_FRAMEWORK.update({
|
||||
'DEFAULT_RENDERER_CLASSES': (
|
||||
'rest_framework.renderers.JSONRenderer',
|
||||
)
|
||||
})
|
||||
|
||||
SALT_USERNAME = '${SALT_USER}'
|
||||
SALT_PASSWORD = '${SALT_PASS}'
|
||||
SALT_HOST = '${SALT_HOST}'
|
||||
MESH_USERNAME = '${MESH_USER}'
|
||||
MESH_SITE = 'https://${MESH_HOST}'
|
||||
MESH_TOKEN_KEY = '${MESH_TOKEN}'
|
||||
REDIS_HOST = '${REDIS_HOST}'
|
||||
MESH_WS_URL = 'ws://${MESH_CONTAINER}:443'
|
||||
EOF
|
||||
)"
|
||||
|
||||
echo "${localvars}" > ${TACTICAL_DIR}/api/tacticalrmm/local_settings.py
|
||||
|
||||
# run migrations and init scripts
|
||||
python manage.py migrate --no-input
|
||||
python manage.py collectstatic --no-input
|
||||
python manage.py initial_db_setup
|
||||
python manage.py initial_mesh_setup
|
||||
python manage.py load_chocos
|
||||
python manage.py load_community_scripts
|
||||
python manage.py reload_nats
|
||||
|
||||
# create super user
|
||||
echo "from accounts.models import User; User.objects.create_superuser('${TRMM_USER}', 'admin@example.com', '${TRMM_PASS}') if not User.objects.filter(username='${TRMM_USER}').exists() else 0;" | python manage.py shell
|
||||
|
||||
# chown everything to tactical user
|
||||
chown -R "${TACTICAL_USER}":"${TACTICAL_USER}" "${TACTICAL_DIR}"
|
||||
|
||||
# create install ready file
|
||||
su -c "echo 'tactical-init' > ${TACTICAL_READY_FILE}" "${TACTICAL_USER}"
|
||||
|
||||
fi
|
||||
|
||||
# backend container
|
||||
if [ "$1" = 'tactical-backend' ]; then
|
||||
check_tactical_ready
|
||||
|
||||
# Prepare log files and start outputting logs to stdout
|
||||
mkdir -p ${TACTICAL_DIR}/api/tacticalrmm/logs
|
||||
touch ${TACTICAL_DIR}/api/tacticalrmm/logs/gunicorn.log
|
||||
touch ${TACTICAL_DIR}/api/tacticalrmm/logs/gunicorn-access.log
|
||||
tail -n 0 -f ${TACTICAL_DIR}/api/tacticalrmm/logs/gunicorn*.log &
|
||||
|
||||
export DJANGO_SETTINGS_MODULE=tacticalrmm.settings
|
||||
|
||||
exec gunicorn tacticalrmm.wsgi:application \
|
||||
--name tactical-backend \
|
||||
--bind 0.0.0.0:80 \
|
||||
--workers 5 \
|
||||
--log-level=info \
|
||||
--log-file=${TACTICAL_DIR}/api/tacticalrmm/logs/gunicorn.log \
|
||||
--access-logfile=${TACTICAL_DIR}/api/tacticalrmm/logs/gunicorn-access.log \
|
||||
|
||||
fi
|
||||
|
||||
if [ "$1" = 'tactical-celery' ]; then
|
||||
check_tactical_ready
|
||||
celery -A tacticalrmm worker
|
||||
fi
|
||||
|
||||
if [ "$1" = 'tactical-celerybeat' ]; then
|
||||
check_tactical_ready
|
||||
test -f "${TACTICAL_DIR}/api/celerybeat.pid" && rm "${TACTICAL_DIR}/api/celerybeat.pid"
|
||||
celery -A tacticalrmm beat
|
||||
fi
|
||||
|
||||
if [ "$1" = 'tactical-celerywinupdate' ]; then
|
||||
check_tactical_ready
|
||||
celery -A tacticalrmm worker -Q wupdate
|
||||
fi
|
||||
@@ -1,99 +0,0 @@
|
||||
# FOR DEV
|
||||
version: "3.7"
|
||||
|
||||
services:
|
||||
# Container that hosts Vue frontend
|
||||
app:
|
||||
image: node:12
|
||||
command: /bin/bash -c "npm install && npm run serve -- --host 0.0.0.0 --port 80 --public ${APP_HOST}"
|
||||
working_dir: /home/node
|
||||
volumes:
|
||||
- ../web:/home/node
|
||||
networks:
|
||||
- proxy
|
||||
|
||||
# Builds Python Virtual Env to share between containers
|
||||
venv:
|
||||
image: python:3.8
|
||||
command: /bin/bash -c "pip install virtualenv && python -m virtualenv env && ./env/bin/pip install -r requirements.txt && ./env/bin/pip install -r requirements-dev.txt"
|
||||
working_dir: /app
|
||||
volumes:
|
||||
- ../api/tacticalrmm:/app
|
||||
|
||||
# Container for Django backend
|
||||
api:
|
||||
image: python:3.8
|
||||
command: /bin/bash -c "python manage.py collectstatic --clear --no-input && python manage.py migrate && sleep 10s && python manage.py initial_db_setup && python manage.py initial_mesh_setup && python manage.py load_chocos && python manage.py runserver 0.0.0.0:80"
|
||||
working_dir: /app
|
||||
environment:
|
||||
VIRTUAL_ENV: /app/env
|
||||
PATH: /app/env/bin:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
|
||||
networks:
|
||||
- proxy
|
||||
- database
|
||||
- redis
|
||||
volumes:
|
||||
- scripts:/srv
|
||||
- mesh_token:/token
|
||||
- ../api/tacticalrmm:/app
|
||||
depends_on:
|
||||
- db
|
||||
- venv
|
||||
- meshcentral
|
||||
|
||||
# Container for Celery worker service
|
||||
celery-service:
|
||||
image: python:3.8
|
||||
command: celery -A tacticalrmm worker -l debug
|
||||
working_dir: /app
|
||||
environment:
|
||||
VIRTUAL_ENV: /app/env
|
||||
PATH: /app/env/bin:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
|
||||
volumes:
|
||||
- ../api/tacticalrmm:/app
|
||||
networks:
|
||||
- redis
|
||||
- proxy
|
||||
- database
|
||||
depends_on:
|
||||
- db
|
||||
- redis
|
||||
- venv
|
||||
|
||||
# Container for Celery beat service
|
||||
celery-beat:
|
||||
image: python:3.8
|
||||
command: celery -A tacticalrmm beat -l debug
|
||||
working_dir: /app
|
||||
environment:
|
||||
VIRTUAL_ENV: /app/env
|
||||
PATH: /app/env/bin:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
|
||||
volumes:
|
||||
- ../api/tacticalrmm:/app
|
||||
networks:
|
||||
- redis
|
||||
- proxy
|
||||
- database
|
||||
depends_on:
|
||||
- db
|
||||
- redis
|
||||
- venv
|
||||
|
||||
# Container for Celery Winupdate tasks
|
||||
celery-winupdate:
|
||||
image: python:3.8
|
||||
command: celery -A tacticalrmm worker -Q wupdate -l debug
|
||||
working_dir: /app
|
||||
environment:
|
||||
VIRTUAL_ENV: /app/env
|
||||
PATH: /app/env/bin:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
|
||||
volumes:
|
||||
- ../api/tacticalrmm:/app
|
||||
networks:
|
||||
- redis
|
||||
- proxy
|
||||
- database
|
||||
depends_on:
|
||||
- db
|
||||
- redis
|
||||
- venv
|
||||
@@ -1,143 +0,0 @@
|
||||
# FOR PROD
|
||||
version: "3.7"
|
||||
|
||||
volumes:
|
||||
# Gives access to the debug log for celery tasks
|
||||
debug_log:
|
||||
|
||||
services:
|
||||
# Container that hosts Vue frontend
|
||||
app:
|
||||
build:
|
||||
context: ..
|
||||
args:
|
||||
- APP_HOST=${APP_HOST}
|
||||
- API_HOST=${API_HOST}
|
||||
dockerfile: "./docker/app/dockerfile"
|
||||
networks:
|
||||
- proxy
|
||||
|
||||
# Container for Django backend
|
||||
api:
|
||||
build:
|
||||
context: ..
|
||||
dockerfile: "./docker/api/dockerfile"
|
||||
args:
|
||||
- DJANGO_SEKRET=${DJANGO_SEKRET}
|
||||
- DJANGO_DEBUG=${DJANGO_DEBUG}
|
||||
- POSTGRES_USER=${POSTGRES_USER}
|
||||
- POSTGRES_PASS=${POSTGRES_PASS}
|
||||
- POSTGRES_HOST=${POSTGRES_HOST}
|
||||
- SALT_PASS=${SALT_PASS}
|
||||
- SALT_USER=${SALT_USER}
|
||||
- SALT_HOST=${SALT_HOST}
|
||||
- REDIS_HOST=${REDIS_HOST}
|
||||
- MESH_USER=${MESH_USER}
|
||||
- MESH_HOST=${MESH_HOST}
|
||||
- APP_HOST=${APP_HOST}
|
||||
- API_HOST=${API_HOST}
|
||||
- ADMIN_URL=${ADMIN_URL}
|
||||
networks:
|
||||
- proxy
|
||||
- database
|
||||
- redis
|
||||
volumes:
|
||||
- scripts:/srv
|
||||
- mesh_token:/token
|
||||
- debug_log:/app/tacticalrmm/private/log
|
||||
depends_on:
|
||||
- db
|
||||
- meshcentral
|
||||
|
||||
# Container for Celery worker service
|
||||
celery-service:
|
||||
build:
|
||||
context: ..
|
||||
dockerfile: "./docker/api/dockerfile"
|
||||
args:
|
||||
- DJANGO_SEKRET=${DJANGO_SEKRET}
|
||||
- DJANGO_DEBUG=${DJANGO_DEBUG}
|
||||
- POSTGRES_USER=${POSTGRES_USER}
|
||||
- POSTGRES_PASS=${POSTGRES_PASS}
|
||||
- POSTGRES_HOST=${POSTGRES_HOST}
|
||||
- SALT_PASS=${SALT_PASS}
|
||||
- SALT_USER=${SALT_USER}
|
||||
- SALT_HOST=${SALT_HOST}
|
||||
- REDIS_HOST=${REDIS_HOST}
|
||||
- MESH_USER=${MESH_USER}
|
||||
- MESH_HOST=${MESH_HOST}
|
||||
- APP_HOST=${APP_HOST}
|
||||
- API_HOST=${API_HOST}
|
||||
- ADMIN_URL=${ADMIN_URL}
|
||||
command: celery -A tacticalrmm worker -l debug
|
||||
networks:
|
||||
- redis
|
||||
- proxy
|
||||
- database
|
||||
volumes:
|
||||
- debug_log:/app/tacticalrmm/private/log
|
||||
depends_on:
|
||||
- db
|
||||
- redis
|
||||
|
||||
# Container for Celery beat service
|
||||
celery-beat:
|
||||
build:
|
||||
context: ..
|
||||
dockerfile: "./docker/api/dockerfile"
|
||||
args:
|
||||
- DJANGO_SEKRET=${DJANGO_SEKRET}
|
||||
- DJANGO_DEBUG=${DJANGO_DEBUG}
|
||||
- POSTGRES_USER=${POSTGRES_USER}
|
||||
- POSTGRES_PASS=${POSTGRES_PASS}
|
||||
- POSTGRES_HOST=${POSTGRES_HOST}
|
||||
- SALT_PASS=${SALT_PASS}
|
||||
- SALT_USER=${SALT_USER}
|
||||
- SALT_HOST=${SALT_HOST}
|
||||
- REDIS_HOST=${REDIS_HOST}
|
||||
- MESH_USER=${MESH_USER}
|
||||
- MESH_HOST=${MESH_HOST}
|
||||
- APP_HOST=${APP_HOST}
|
||||
- API_HOST=${API_HOST}
|
||||
- ADMIN_URL=${ADMIN_URL}
|
||||
command: celery -A tacticalrmm beat -l debug
|
||||
networks:
|
||||
- redis
|
||||
- proxy
|
||||
- database
|
||||
volumes:
|
||||
- debug_log:/app/tacticalrmm/private/log
|
||||
depends_on:
|
||||
- db
|
||||
- redis
|
||||
|
||||
# Container for Celery Winupdate tasks
|
||||
celery-winupdate:
|
||||
build:
|
||||
context: ..
|
||||
dockerfile: "./docker/api/dockerfile"
|
||||
args:
|
||||
- DJANGO_SEKRET=${DJANGO_SEKRET}
|
||||
- DJANGO_DEBUG=${DJANGO_DEBUG}
|
||||
- POSTGRES_USER=${POSTGRES_USER}
|
||||
- POSTGRES_PASS=${POSTGRES_PASS}
|
||||
- POSTGRES_HOST=${POSTGRES_HOST}
|
||||
- SALT_PASS=${SALT_PASS}
|
||||
- SALT_USER=${SALT_USER}
|
||||
- SALT_HOST=${SALT_HOST}
|
||||
- REDIS_HOST=${REDIS_HOST}
|
||||
- MESH_USER=${MESH_USER}
|
||||
- MESH_HOST=${MESH_HOST}
|
||||
- APP_HOST=${APP_HOST}
|
||||
- API_HOST=${API_HOST}
|
||||
- ADMIN_URL=${ADMIN_URL}
|
||||
command: celery -A tacticalrmm worker -Q wupdate -l debug
|
||||
networks:
|
||||
- redis
|
||||
- proxy
|
||||
- database
|
||||
volumes:
|
||||
- debug_log:/app/tacticalrmm/private/log
|
||||
depends_on:
|
||||
- db
|
||||
- redis
|
||||
@@ -1,6 +1,6 @@
version: "3.7"

# Userdefined Networks
# networks
networks:
  proxy:
    driver: bridge
@@ -8,102 +8,202 @@ networks:
      driver: default
      config:
        - subnet: 172.20.0.0/24
  database:
  api-db:
  redis:
  mesh-mongodb:
  mesh-db:

# Docker managed persistent volumes
# docker managed persistent volumes
volumes:
  # Volume for userdefined scripts
  scripts:
  # Volume for mesh token initial setup
  mesh_token:
  # Used to make the salt data persistent
  tactical_data:
  salt_data:
  # Makes Postgres data persistent
  postgres_data13:
  # Makes mesh central data persistent
  postgres_data:
  mongo_data:
  mesh_data:

services:
  # Postgres Database for API service
  db:
    image: postgres:13
  # postgres database for api service
  tactical-postgres:
    image: postgres:13-alpine
    restart: always
    environment:
      POSTGRES_DB: tacticalrmm
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASS}
    volumes:
      - postgres_data13:/var/lib/postgresql/data
      - postgres_data:/var/lib/postgresql/data
    networks:
      - database
      - api-db

  # Redis Container for Celery tasks
  redis:
    image: redis
  # redis container for celery tasks
  tactical-redis:
    image: redis:6.0-alpine
    restart: always
    networks:
      - redis

  # Salt Master and API
  salt:
    build:
      context: ..
      dockerfile: ./docker/salt/dockerfile
      args:
        - SALT_USER=${SALT_USER}
        - SALT_PASS=${SALT_PASS}
  # used to initialize the docker environment
  tactical-init:
    image: ${IMAGE_REPO}tactical:${VERSION}
    restart: on-failure
    command: ["tactical-init"]
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASS: ${POSTGRES_PASS}
      APP_HOST: ${APP_HOST}
      API_HOST: ${API_HOST}
      MESH_HOST: ${MESH_HOST}
      TRMM_USER: ${TRMM_USER}
      TRMM_PASS: ${TRMM_PASS}
    depends_on:
      - tactical-postgres
      - tactical-meshcentral
    networks:
      - api-db
      - proxy
    volumes:
      - tactical_data:/opt/tactical

  # salt master and api
  tactical-salt:
    image: ${IMAGE_REPO}tactical-salt:${VERSION}
    restart: always
    ports:
      - "4505:4505"
      - "4506:4506"
    volumes:
      - scripts:/srv
      - tactical_data:/opt/tactical
      - salt_data:/etc/salt
    networks:
      - proxy

  # nats
  tactical-nats:
    image: ${IMAGE_REPO}tactical-nats:${VERSION}
    restart: always
    ports:
      - "4222:4222"
    volumes:
      - tactical_data:/opt/tactical
    networks:
      proxy:
        aliases:
          - ${API_HOST}

  # MeshCentral Container
  meshcentral:
    build:
      context: ./meshcentral
      args:
        - MESH_HOST=${MESH_HOST}
        - MESH_USER=${MESH_USER}
        - MESH_PASS=${MESH_PASS}
        - EMAIL_USER=${EMAIL_USER}
        - MONGODB_USER=${MONGODB_USER}
        - MONGODB_PASSWORD=${MONGODB_PASSWORD}
  # meshcentral container
  tactical-meshcentral:
    image: ${IMAGE_REPO}tactical-meshcentral:${VERSION}
    restart: always
    environment:
      MESH_HOST: ${MESH_HOST}
      MESH_USER: ${MESH_USER}
      MESH_PASS: ${MESH_PASS}
      MONGODB_USER: ${MONGODB_USER}
      MONGODB_PASSWORD: ${MONGODB_PASSWORD}
    networks:
      - proxy
      - mesh-mongodb
      - mesh-db
    volumes:
      - mesh_token:/token
      - tactical_data:/opt/tactical
      - mesh_data:/home/node/app/meshcentral-data
    depends_on:
      - mesh-mongodb
      - nginx-proxy
      - tactical-mongodb

  # MongoDB Container for MeshCentral
  mesh-mongodb:
    image: mongo
  # mongodb container for meshcentral
  tactical-mongodb:
    image: mongo:4.4
    restart: always
    environment:
      MONGO_INITDB_ROOT_USERNAME: ${MONGODB_USER}
      MONGO_INITDB_ROOT_PASSWORD: ${MONGODB_PASSWORD}
      MONGO_INITDB_DATABASE: meshcentral
    networks:
      - mesh-mongodb
      - mesh-db
    volumes:
      - mongo_data:/data
      - mongo_data:/data/db

  # Nginx Container Reverse Proxy that handles all http/https traffic
  nginx-proxy:
    build:
      context: ./nginx-proxy
      args:
        - APP_HOST=${APP_HOST}
        - API_HOST=${API_HOST}
        - MESH_HOST=${MESH_HOST}
    ports:
      - "80:80"
      - "443:443"
  # container that hosts vue frontend
  tactical-frontend:
    image: ${IMAGE_REPO}tactical-frontend:${VERSION}
    restart: always
    networks:
      - proxy
    environment:
      API_HOST: ${API_HOST}

  # container for django backend
  tactical-backend:
    image: ${IMAGE_REPO}tactical:${VERSION}
    command: ["tactical-backend"]
    restart: always
    networks:
      - proxy
      - api-db
      - redis
    volumes:
      - tactical_data:/opt/tactical
    depends_on:
      - tactical-postgres

  tactical-nginx:
    # container for tactical reverse proxy
    image: ${IMAGE_REPO}tactical-nginx:${VERSION}
    restart: always
    environment:
      APP_HOST: ${APP_HOST}
      API_HOST: ${API_HOST}
      MESH_HOST: ${MESH_HOST}
      CERT_PUB_KEY: ${CERT_PUB_KEY}
      CERT_PRIV_KEY: ${CERT_PRIV_KEY}
    networks:
      proxy:
        ipv4_address: 172.20.0.20
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - tactical_data:/opt/tactical

  # container for celery worker service
  tactical-celery:
    image: ${IMAGE_REPO}tactical:${VERSION}
    command: ["tactical-celery"]
    restart: always
    networks:
      - redis
      - proxy
      - api-db
    volumes:
      - tactical_data:/opt/tactical
    depends_on:
      - tactical-postgres
      - tactical-redis

  # container for celery beat service
  tactical-celerybeat:
    image: ${IMAGE_REPO}tactical:${VERSION}
    command: ["tactical-celerybeat"]
    restart: always
    networks:
      - proxy
      - redis
      - api-db
    volumes:
      - tactical_data:/opt/tactical
    depends_on:
      - tactical-postgres
      - tactical-redis

  # container for celery winupdate tasks
  tactical-celerywinupdate:
    image: ${IMAGE_REPO}tactical:${VERSION}
    command: ["tactical-celerywinupdate"]
    restart: always
    networks:
      - redis
      - proxy
      - api-db
    volumes:
      - tactical_data:/opt/tactical
    depends_on:
      - tactical-postgres
      - tactical-redis

13 docker/image-build.sh Executable file
@@ -0,0 +1,13 @@
#!/usr/bin/env bash

set -o errexit
set -o pipefail

DOCKER_IMAGES="tactical tactical-frontend tactical-nginx tactical-meshcentral tactical-salt tactical-nats"

cd ..

for DOCKER_IMAGE in ${DOCKER_IMAGES}; do
    echo "Building Tactical Image: ${DOCKER_IMAGE}..."
    docker build --pull --no-cache -t "${DOCKER_IMAGE}" -f "docker/containers/${DOCKER_IMAGE}/dockerfile" .
done
@@ -1,36 +0,0 @@
{
  "settings": {
    "mongodb": "mongodb://${MONGODB_USER}:${MONGODB_PASSWORD}@mesh-mongodb:27017",
    "Cert": "${MESH_HOST}",
    "TLSOffload": "172.20.0.20",
    "RedirPort": 80,
    "WANonly": true,
    "Minify": 1,
    "Port": 443,
    "AllowLoginToken": true,
    "AllowFraming": true,
    "_AgentPing": 60,
    "AgentPong": 300,
    "AllowHighQualityDesktop": true,
    "MaxInvalidLogin": {
      "time": 5,
      "count": 5,
      "coolofftime": 30
    }
  },
  "domains": {
    "": {
      "Title": "Dev RMM",
      "Title2": "DevRMM",
      "NewAccounts": false,
      "mstsc": true,
      "GeoLocation": true,
      "CertUrl": "https://172.20.0.20:443",
      "httpheaders": {
        "Strict-Transport-Security": "max-age=360000",
        "_x-frame-options": "sameorigin",
        "Content-Security-Policy": "default-src 'none'; script-src 'self' 'unsafe-inline'; connect-src 'self'; img-src 'self' data:; style-src 'self' 'unsafe-inline'; frame-src 'self'; media-src 'self'"
      }
    }
  }
}
@@ -1,21 +0,0 @@
FROM node:stretch

WORKDIR /home/node/app
ARG MESH_HOST
ARG MESH_USER
ARG MESH_PASS
ARG EMAIL_USER
ARG MONGODB_USER
ARG MONGODB_PASSWORD
RUN apt-get update && apt-get install -y gettext-base
RUN npm install meshcentral@0.6.62
COPY config.json ./meshcentral-data/config.json.tmp
RUN envsubst '\$MESH_HOST, \$MONGODB_USER, \$MONGODB_PASSWORD' < /home/node/app/meshcentral-data/config.json.tmp > /home/node/app/meshcentral-data/config.json && \
    rm /home/node/app/meshcentral-data/config.json.tmp

COPY entry.sh ./entry.sh.tmp
RUN envsubst < /home/node/app/entry.sh.tmp > /home/node/app/entry.sh && \
    rm /home/node/app/entry.sh.tmp && \
    chmod +x entry.sh

CMD ./entry.sh
@@ -1,11 +0,0 @@
#!/bin/bash

node node_modules/meshcentral --createaccount ${MESH_USER} --pass ${MESH_PASS} --email ${EMAIL_USER}
node node_modules/meshcentral --adminaccount ${MESH_USER}

FILE=/token/token.key
if [ ! -f "$FILE" ]; then
    node ./node_modules/meshcentral --logintokenkey > /token/token.key
fi

node node_modules/meshcentral
@@ -1,41 +0,0 @@
server {
    resolver 127.0.0.11 valid=30s;

    server_name ${API_HOST};

    location / {
        #Using variable to disable start checks
        set $api http://api;

        proxy_pass $api;
        proxy_http_version 1.1;
        proxy_cache_bypass $http_upgrade;

        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Forwarded-Host $host;
        proxy_set_header X-Forwarded-Port $server_port;
    }

    error_log /var/log/nginx/api-error.log;
    access_log /var/log/nginx/api-access.log;

    client_max_body_size 300M;

    listen 443 ssl;
    ssl_certificate /cert/fullchain.pem;
    ssl_certificate_key /cert/privkey.pem;
    ssl_ciphers 'ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256';

}

server {

    listen 80;
    server_name ${API_HOST};
    return 301 https://$server_name$request_uri;
}
@@ -1,40 +0,0 @@
server {
    resolver 127.0.0.11 valid=30s;

    server_name ${APP_HOST};

    location / {
        #Using variable to disable start checks
        set $app http://app;

        proxy_pass $app;
        proxy_http_version 1.1;
        proxy_cache_bypass $http_upgrade;

        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Forwarded-Host $host;
        proxy_set_header X-Forwarded-Port $server_port;
    }

    error_log /var/log/nginx/app-error.log;
    access_log /var/log/nginx/app-access.log;

    listen 443 ssl;
    ssl_certificate /cert/fullchain.pem;
    ssl_certificate_key /cert/privkey.pem;
    ssl_ciphers 'ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256';

}

server {

    listen 80;
    server_name ${APP_HOST};
    return 301 https://$server_name$request_uri;
}

@@ -1,27 +0,0 @@
FROM nginx
WORKDIR /etc/nginx/conf.d

ARG APP_HOST
ARG API_HOST
ARG MESH_HOST

EXPOSE 80
EXPOSE 443

#Remove default NGINX config
RUN rm /etc/nginx/conf.d/default.conf

#Copy APP config
COPY app.conf ./app.conf.tmp
RUN envsubst '\$APP_HOST' < /etc/nginx/conf.d/app.conf.tmp > /etc/nginx/conf.d/app.conf && rm /etc/nginx/conf.d/app.conf.tmp

#Copy API config
COPY api.conf ./api.conf.tmp
RUN envsubst '\$API_HOST' < /etc/nginx/conf.d/api.conf.tmp > /etc/nginx/conf.d/api.conf && rm /etc/nginx/conf.d/api.conf.tmp

#Copy Mesh config
COPY mesh.conf ./mesh.conf.tmp
RUN envsubst '\$MESH_HOST' < /etc/nginx/conf.d/mesh.conf.tmp > /etc/nginx/conf.d/mesh.conf && rm /etc/nginx/conf.d/mesh.conf.tmp

#Copy Certs
COPY ./cert/*.pem /cert/
@@ -1,39 +0,0 @@

server {
    resolver 127.0.0.11 valid=30s;

    listen 443 ssl;
    proxy_send_timeout 330s;
    proxy_read_timeout 330s;
    server_name ${MESH_HOST};
    ssl_certificate /cert/fullchain.pem;
    ssl_certificate_key /cert/privkey.pem;
    ssl_session_cache shared:WEBSSL:10m;
    ssl_ciphers HIGH:!aNULL:!MD5;
    ssl_prefer_server_ciphers on;

    location / {
        #Using variable to disable start checks
        set $meshcentral http://meshcentral:443;

        proxy_pass $meshcentral;
        proxy_http_version 1.1;

        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";

        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Host $host:$server_port;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}

server {
    resolver 127.0.0.11 valid=30s;

    listen 80;
    server_name ${MESH_HOST};
    return 301 https://$server_name$request_uri;
}
107 docker/readme.md
@@ -1,91 +1,72 @@
# Docker Setup

- install docker and docker-compose
- Obtain wildcard cert or individual certs for each subdomain
- You can copy any wildcard cert public and private key to the docker/nginx-proxy/certs folder.

## Generate certificates with certbot (Optional if you already have the certs)
- Install docker and docker-compose
- Obtain a valid wildcard certificate for your domain. If certificates are not provided, a self-signed certificate will be generated and most agent functions won't work. See below on how to generate a free Let's Encrypt certificate.

## Generate certificates with certbot
Install Certbot

```
sudo add-apt-repository ppa:certbot/certbot
sudo apt-get install certbot
```

Generate the wildcard certificate. Add the DNS entry for domain validation.
Generate the wildcard certificate. Add the DNS entry for domain validation. Replace `example.com` with your root domain.

```
sudo certbot certonly --manual -d *.example.com --agree-tos --no-bootstrap --manual-public-ip-logging-ok --preferred-challenges dns
```
Copy the fullchain.pem and privkey.pem to the nginx-proxy/cert directory.
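
For example (a sketch only; it assumes the default Let's Encrypt live path for `example.com` from the certbot command above, and that you run it from the repository root where the `docker/nginx-proxy/cert` directory lives):

```
# copy the wildcard cert into the directory the nginx-proxy image builds from
# (paths are assumptions; adjust to where your certs actually live)
sudo cp /etc/letsencrypt/live/example.com/fullchain.pem docker/nginx-proxy/cert/fullchain.pem
sudo cp /etc/letsencrypt/live/example.com/privkey.pem docker/nginx-proxy/cert/privkey.pem
```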

## Configure DNS and Firewall
## Configure DNS and firewall

You will need to add DNS entries so that the three subdomains resolve to the IP of the docker host. There is a reverse proxy running that will route the hostnames to the correct container. On the host, you will need to ensure the firewall is open on tcp ports 80, 443, 4505, 4506.
You will need to add DNS entries so that the three subdomains resolve to the IP of the docker host. There is a reverse proxy running that will route the hostnames to the correct container. On the host, you will need to ensure the firewall is open on tcp ports 80, 443, 4222, 4505, 4506.
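
As an illustration only (this assumes the host uses ufw, which the project does not mandate; the port list comes from the paragraph above, so use your own firewall tooling if different):

```
# open the ports the reverse proxy, NATS and Salt need (ufw is an assumption)
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
sudo ufw allow 4222/tcp
sudo ufw allow 4505/tcp
sudo ufw allow 4506/tcp
```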

## Run the environment with Docker
## Setting up the environment

Copy the .env.example to .env, then change the values in .env to match your environment.
Get the docker-compose and .env.example files onto the host you wish to install on.

```
wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/docker/docker-compose.yml
wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/docker/.env.example
mv .env.example .env
```

Change the values in .env to match your environment.
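
A rough example of what the edited .env might look like (the variable names are taken from the docker-compose.yml above; every value here is a placeholder, not actual .env.example content):

```
# hostnames must match the DNS entries created earlier (placeholder domains)
APP_HOST=rmm.example.com
API_HOST=api.example.com
MESH_HOST=mesh.example.com

# service credentials (placeholders; pick your own strong values)
POSTGRES_USER=postgres
POSTGRES_PASS=changeme
MONGODB_USER=mongouser
MONGODB_PASSWORD=changeme
MESH_USER=meshadmin
MESH_PASS=changeme
TRMM_USER=tactical
TRMM_PASS=changeme
```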

If you are supplying certificates through Let's Encrypt or another source, see the section below about base64 encoding the certificate files.

## Base64 encoding certificates to pass as env variables

Use the below command to add the correct values to the .env.

Running this command multiple times will add redundant entries, so those will need to be removed.
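
One way to strip any previously added entries before re-running the commands below (a sketch; it assumes GNU sed and that the keys were only ever added by this section's commands):

```
# delete existing CERT_PUB_KEY / CERT_PRIV_KEY lines from .env before appending fresh ones
sed -i '/^CERT_PUB_KEY=/d;/^CERT_PRIV_KEY=/d' .env
```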

Let's Encrypt cert paths are below. Replace ${rootdomain} with your own.

public key
`/etc/letsencrypt/live/${rootdomain}/fullchain.pem`

private key
`/etc/letsencrypt/live/${rootdomain}/privkey.pem`

```
echo "CERT_PUB_KEY=$(sudo base64 -w 0 /path/to/pub/key)" >> .env
echo "CERT_PRIV_KEY=$(sudo base64 -w 0 /path/to/priv/key)" >> .env
```

## Starting the environment

Run the below command to start the environment.

```
cd docker
sudo docker-compose up -d
```

You may need to run this twice if some containers fail to start.

## Create a super user

```
sudo docker-compose exec api python manage.py createsuperuser
```
Removing the -d will start the containers in the foreground and is useful for debugging.

## Get MeshCentral EXE download link

Run the below command to get the download link for the MeshCentral exe. The dashboard will ask for this when you first sign in.
Run the below command to get the download link for the MeshCentral exe. This needs to be uploaded on the first successful sign-in.

```
sudo docker-compose exec api python manage.py get_mesh_exe_url
sudo docker-compose exec tactical-backend python manage.py get_mesh_exe_url
```

## Connect to a container instance shell

The below command opens up a shell to the api service.

```
sudo docker-compose exec api /bin/bash
```

If /bin/bash doesn't work then /bin/sh might need to be used.

## Using Docker for Dev (optional)

This allows you to edit the files locally and those changes will be presented to the containers. Hot Module Reload (Vue/webpack) and the Python equivalent will also work!

### Setup

Files that need to be manually created are:
- api/tacticalrmm/tacticalrmm/local_settings.py
- web/.env

Make sure to add `MESH_WS_URL="ws://meshcentral:443"` in the local_settings.py file. This is needed for the MeshCentral setup.
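
For example, a minimal way to create that file with just this one setting (a sketch; a real local_settings.py for dev will usually carry additional settings not shown here):

```
# create the dev-only settings file; MESH_WS_URL is required for the MeshCentral setup
cat > api/tacticalrmm/tacticalrmm/local_settings.py << 'EOF'
MESH_WS_URL = "ws://meshcentral:443"
EOF
```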

For HMR to work with Vue, you can copy .env.example and modify the settings to fit your dev environment.

### Create Python Virtual Env

Each Python container shares the same virtual env to make spinning up faster. It is located in api/tacticalrmm/env.

There is a container dedicated to creating and keeping this up to date. Prior to spinning up the environment you can run `docker-compose -f docker-compose.yml -f docker-compose.dev.yml up venv` to make sure the virtual env is ready. Otherwise the api and celery containers will fail to start.

### Spinup the environment

Now run `docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d` to spin everything else up.

This will mount the local Vue and Python files in the app container with hot reload. Rebuilding is not required when code changes are made; the changes take effect immediately!

### Other Considerations

- It is recommended that you use the VS Code Docker extension to manage containers. Docker Desktop works well too on Windows.

@@ -1,14 +0,0 @@
timeout: 20
gather_job_timeout: 25
max_event_size: 30485760
external_auth:
  pam:
    ${SALT_USER}:
      - .*
      - '@runner'
      - '@wheel'
      - '@jobs'
rest_cherrypy:
  port: 8123
  disable_ssl: True
  max_request_body_size: 30485760
@@ -1,30 +0,0 @@
FROM ubuntu:20.04

ARG SALT_USER
ARG SALT_PASS

RUN adduser --no-create-home --disabled-password --gecos "" ${SALT_USER} && \
    echo "${SALT_USER}:${SALT_PASS}" | chpasswd && \
    apt-get update && \
    apt-get install -y ca-certificates wget gnupg2 gettext-base tzdata supervisor && \
    wget -O - https://repo.saltstack.com/py3/ubuntu/20.04/amd64/latest/SALTSTACK-GPG-KEY.pub | apt-key add - && \
    echo 'deb http://repo.saltstack.com/py3/ubuntu/20.04/amd64/latest focal main' | tee /etc/apt/sources.list.d/saltstack.list && \
    apt-get update && \
    apt-get install -y salt-master salt-api && \
    mkdir -p /var/log/supervisor

COPY ./docker/salt/api.conf /etc/salt/master.d/rmm-salt.tmp
RUN envsubst '\$SALT_USER' < /etc/salt/master.d/rmm-salt.tmp | tee /etc/salt/master.d/rmm-salt.conf && \
    rm /etc/salt/master.d/rmm-salt.tmp

RUN sed -i 's/msgpack_kwargs = {"raw": six.PY2}/msgpack_kwargs = {"raw": six.PY2, "max_buffer_size": 2147483647}/g' /usr/lib/python3/dist-packages/salt/transport/ipc.py
COPY ./docker/salt/supervisor.conf /etc/supervisor/conf.d/supervisor.conf

COPY ./_modules /srv/salt/_modules
COPY ./scripts /srv/salt/scripts
RUN mkdir -p /srv/salt/scripts/userdefined && \
    chown -R 1000:102 /srv/salt/scripts/userdefined && \
    chmod -R 771 /srv/salt/scripts/userdefined

EXPOSE 8123 4505 4506
CMD ["/usr/bin/supervisord"]
@@ -1,12 +0,0 @@
[supervisord]
nodaemon=true
[include]
files = /etc/supervisor/conf.d/*.conf

[program:salt-master]
command=/bin/bash -c "salt-master -l debug"
redirect_stderr=true

[program:salt-api]
command=/bin/bash -c "salt-api -l debug"
redirect_stderr=true
12 docs/.npmignore Executable file
@@ -0,0 +1,12 @@
pids
logs
node_modules
npm-debug.log
coverage/
run
dist
.DS_Store
.nyc_output
.basement
config.local.js
basement_dist
41 docs/.vuepress/config.js Executable file
@@ -0,0 +1,41 @@
const { description } = require('../package')

module.exports = {
  base: '/tacticalrmm/',
  title: 'Tactical RMM',
  description: description,

  head: [
    ['meta', { name: 'theme-color', content: '#3eaf7c' }],
    ['meta', { name: 'apple-mobile-web-app-capable', content: 'yes' }],
    ['meta', { name: 'apple-mobile-web-app-status-bar-style', content: 'black' }]
  ],
  themeConfig: {
    repo: '',
    editLinks: false,
    docsDir: '',
    editLinkText: '',
    lastUpdated: false,
    nav: [
      {
        text: 'Guide',
        link: '/guide/',
      }
    ],
    sidebar: {
      '/guide/': [
        {
          title: 'Guide',
          collapsable: false,
          children: [
            '',
          ]
        }
      ],
    }
  },
  plugins: [
    //'@vuepress/plugin-back-to-top',
    //'@vuepress/plugin-medium-zoom',
  ]
}
14 docs/.vuepress/enhanceApp.js Executable file
@@ -0,0 +1,14 @@
/**
 * Client app enhancement file.
 *
 * https://v1.vuepress.vuejs.org/guide/basic-config.html#app-level-enhancements
 */

export default ({
  Vue, // the version of Vue being used in the VuePress app
  options, // the options for the root Vue instance
  router, // the router instance for the app
  siteData // site metadata
}) => {
  // ...apply enhancements for the site.
}
8 docs/.vuepress/styles/index.styl Executable file
@@ -0,0 +1,8 @@
/**
 * Custom Styles here.
 *
 * ref:https://v1.vuepress.vuejs.org/config/#index-styl
 */

.home .hero img
  max-width 450px!important
10 docs/.vuepress/styles/palette.styl Executable file
@@ -0,0 +1,10 @@
/**
 * Custom palette here.
 *
 * ref:https://v1.vuepress.vuejs.org/zh/config/#palette-styl
 */

$accentColor = #3eaf7c
$textColor = #2c3e50
$borderColor = #eaecef
$codeBgColor = #282c34
14 docs/deploy.sh Executable file
@@ -0,0 +1,14 @@
#!/usr/bin/env sh

set -e

npm run build

cd .vuepress/dist

git init
git add -A
git commit -m 'deploy'

git push -f git@github.com:wh1te909/tacticalrmm.git develop:gh-pages
cd -
2 docs/guide/README.md Executable file
@@ -0,0 +1,2 @@
# Installation

6 docs/index.md Executable file
@@ -0,0 +1,6 @@
---
home: true
heroImage: https://v1.vuepress.vuejs.org/hero.png
actionText: Documentation →
actionLink: /guide/
---
10783 docs/package-lock.json generated Normal file
File diff suppressed because it is too large.
Some files were not shown because too many files have changed in this diff.