mirror of https://github.com/wazuh/wazuh-docker.git
synced 2025-11-04 14:03:24 +00:00

Compare commits: cloud-2.0. ... 2.0_5.4.2 (1 commit: 78bf058a9f)

CHANGELOG.md (219 lines removed)

@@ -1,219 +0,0 @@
# Change Log
All notable changes to this project will be documented in this file.

## Wazuh Docker v3.10.2_7.3.2

### Added

- Update to Wazuh version 3.10.2_7.3.2

## Wazuh Docker v3.10.0_7.3.2

### Added

- Update to Wazuh version 3.10.0_7.3.2

## Wazuh Docker v3.9.5_7.2.1

### Added

- Update to Wazuh version 3.9.5_7.2.1

## Wazuh Docker v3.9.4_7.2.0

### Added

- Update to Wazuh version 3.9.4_7.2.0
- Implemented Wazuh Filebeat Module ([jm404](https://www.github.com/jm404)) [#2a77c6a](https://github.com/wazuh/wazuh-docker/commit/2a77c6a6e6bf78f2492adeedbade7a507d9974b2)
## Wazuh Docker v3.9.3_7.2.0

### Fixed

- Wazuh-docker reinserts cluster settings after resuming containers ([@manuasir](https://github.com/manuasir)) [#213](https://github.com/wazuh/wazuh-docker/pull/213)

## Wazuh Docker v3.9.2_7.1.1

### Added

- Update to Wazuh version 3.9.2_7.1.1

## Wazuh Docker v3.9.3_6.8.1

### Added

- Update to Wazuh version 3.9.3_6.8.1
- Option to disable additional X-Pack applications and hide unnecessary management links ([@SitoRBJ](https://github.com/SitoRBJ)) ([#163](https://github.com/wazuh/wazuh-docker/pull/163))

## Wazuh Docker v3.9.2_6.8.0

### Added

- Update to Wazuh version 3.9.2_6.8.0

## Wazuh Docker v3.9.1_7.1.0

### Added

- Support for Elastic v7.1.0
- New environment variables for Kibana ([@manuasir](https://github.com/manuasir)) [#22ad43](https://github.com/wazuh/wazuh-docker/commit/22ad4360f548e54bb0c5e929f8c84a186ad2ab88)

## Wazuh Docker v3.9.1_6.8.0

### Added

- Update to Wazuh version 3.9.1_6.8.0 ([#181](https://github.com/wazuh/wazuh-docker/pull/181))
- Security for Elastic Stack in Docker implemented ([#186](https://github.com/wazuh/wazuh-docker/issues/186))

### Fixed

- Fixed `ELASTICSEARCH_KIBANA_IP` environment variable ([@manuasir](https://github.com/manuasir)) ([#181](https://github.com/wazuh/wazuh-docker/pull/181))
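
Variables like `ELASTICSEARCH_KIBANA_IP` are supplied to the containers through their environment. A minimal sketch of how that typically looks in a Compose file (the service name, image tag, and value shown here are illustrative assumptions, not taken from this repository):

```yaml
# Hypothetical docker-compose fragment: pointing the Kibana container at
# Elasticsearch via the environment variable fixed above.
# Service name, image tag, and address are illustrative only.
services:
  kibana:
    image: wazuh/wazuh-kibana:3.9.1_6.8.0
    environment:
      - ELASTICSEARCH_KIBANA_IP=http://elasticsearch:9200
```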
## Wazuh Docker v3.9.0_6.7.2

### Changed

- Update Elastic Stack version to 6.7.2.

## Wazuh Docker v3.9.0_6.7.1

### Added

- Support for X-Pack authorized requests ([@manuasir](https://github.com/manuasir)) ([#119](https://github.com/wazuh/wazuh-docker/pull/119))
- Add Elasticsearch cluster configuration ([@SitoRBJ](https://github.com/SitoRBJ)) ([#146](https://github.com/wazuh/wazuh-docker/pull/146))
- Add Elasticsearch cluster configuration ([@Phandora](https://github.com/Phandora)) ([#140](https://github.com/wazuh/wazuh-docker/pull/140))
- Setting Nginx to support several users/passwords in Kibana ([@toniMR](https://github.com/toniMR)) ([#136](https://github.com/wazuh/wazuh-docker/pull/136))

### Changed

- Use LS_JAVA_OPTS instead of old LS_HEAP_SIZE ([@ruffy91](https://github.com/ruffy91)) ([#139](https://github.com/wazuh/wazuh-docker/pull/139))
- Changing the original Wazuh docker image to allow adding code in the entrypoint ([@Phandora](https://github.com/phandora)) ([#151](https://github.com/wazuh/wazuh-docker/pull/151))
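
With this change, the Logstash heap is sized through JVM flags rather than `LS_HEAP_SIZE`. A minimal sketch of setting it in a Compose file (the service name, image tag, and heap sizes are illustrative assumptions, not values from this repository):

```yaml
# Hypothetical docker-compose fragment: sizing the Logstash JVM heap with
# LS_JAVA_OPTS (extra JVM flags read by Logstash), replacing the
# deprecated LS_HEAP_SIZE. Values are illustrative only.
services:
  logstash:
    image: wazuh/wazuh-logstash:3.9.0_6.7.1
    environment:
      - LS_JAVA_OPTS=-Xms512m -Xmx512m
```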

### Removed

- Removing files from Wazuh image ([@Phandora](https://github.com/phandora)) ([#153](https://github.com/wazuh/wazuh-docker/pull/153))

## Wazuh Docker v3.8.2_6.7.0

### Changed

- Update Elastic Stack version to 6.7.0. ([#144](https://github.com/wazuh/wazuh-docker/pull/144))

## Wazuh Docker v3.8.2_6.6.2

### Changed

- Update Elastic Stack version to 6.6.2. ([#130](https://github.com/wazuh/wazuh-docker/pull/130))

## Wazuh Docker v3.8.2_6.6.1

### Changed

- Update Elastic Stack version to 6.6.1. ([#129](https://github.com/wazuh/wazuh-docker/pull/129))

## Wazuh Docker v3.8.2_6.5.4

### Added

- Add Wazuh-Elasticsearch. ([#106](https://github.com/wazuh/wazuh-docker/pull/106))
- Store Filebeat _/var/lib/filebeat/registry._ ([#109](https://github.com/wazuh/wazuh-docker/pull/109))
- Adding the option to disable some X-Pack features. ([#111](https://github.com/wazuh/wazuh-docker/pull/111))
- Wazuh-Kibana customizable at plugin level. ([#117](https://github.com/wazuh/wazuh-docker/pull/117))
- Adding env variables for alerts data flow. ([#118](https://github.com/wazuh/wazuh-docker/pull/118))
- New Logstash entrypoint added. ([#135](https://github.com/wazuh/wazuh-docker/pull/135/files))
- Welcome screen management. ([#133](https://github.com/wazuh/wazuh-docker/pull/133))

### Changed

- Update to Wazuh version 3.8.2. ([#105](https://github.com/wazuh/wazuh-docker/pull/105))

### Removed

- Remove alerts created in build time. ([#137](https://github.com/wazuh/wazuh-docker/pull/137))
## Wazuh Docker v3.8.1_6.5.4

### Changed

- Update to Wazuh version 3.8.1. ([#102](https://github.com/wazuh/wazuh-docker/pull/102))

## Wazuh Docker v3.8.0_6.5.4

### Changed

- Upgrade to version 3.8.0_6.5.4. ([#97](https://github.com/wazuh/wazuh-docker/pull/97))

### Removed

- Remove cluster.py workaround. ([#99](https://github.com/wazuh/wazuh-docker/pull/99))

## Wazuh Docker v3.7.2_6.5.4

### Added

- Improvements to Kibana settings added. ([#91](https://github.com/wazuh/wazuh-docker/pull/91))
- Add Kibana environment variables for Wazuh APP config.yml. ([#89](https://github.com/wazuh/wazuh-docker/pull/89))

### Changed

- Update Elastic Stack version to 6.5.4. ([#82](https://github.com/wazuh/wazuh-docker/pull/82))
- Add env credentials for nginx. ([#86](https://github.com/wazuh/wazuh-docker/pull/86))
- Improve Filebeat configuration. ([#88](https://github.com/wazuh/wazuh-docker/pull/88))

### Fixed

- Temporary fix for Wazuh cluster master node in Kubernetes. ([#84](https://github.com/wazuh/wazuh-docker/pull/84))

## Wazuh Docker v3.7.2_6.5.3

### Changed

- Erasing temporary fix for AWS integration. ([#81](https://github.com/wazuh/wazuh-docker/pull/81))

### Fixed

- Upgrading errors due to wrong files. ([#80](https://github.com/wazuh/wazuh-docker/pull/80))

## Wazuh Docker v3.7.0_6.5.0

### Changed

- Adapt to Elastic Stack 6.5.0.

## Wazuh Docker v3.7.0_6.4.3

### Added

- Allow custom scripts or commands before service start ([#58](https://github.com/wazuh/wazuh-docker/pull/58))
- Added description for wazuh-nginx ([#59](https://github.com/wazuh/wazuh-docker/pull/59))
- Added license file to match https://github.com/wazuh/wazuh LICENSE ([#60](https://github.com/wazuh/wazuh-docker/pull/60))
- Added SMTP packages ([#67](https://github.com/wazuh/wazuh-docker/pull/67))

### Changed

- Increased proxy buffer for NGINX Kibana ([#51](https://github.com/wazuh/wazuh-docker/pull/51))
- Updated Logstash config to remove deprecation warnings ([#55](https://github.com/wazuh/wazuh-docker/pull/55))
- Set ossec user's home path ([#61](https://github.com/wazuh/wazuh-docker/pull/61))

### Fixed

- Fixed a bug that prevented the API from starting when the Wazuh manager was updated. Change in the files that are stored in the volume. ([#65](https://github.com/wazuh/wazuh-docker/pull/65))
- Fixed script reference ([#62](https://github.com/wazuh/wazuh-docker/pull/62/files))

## Wazuh Docker v3.6.1_6.4.3

Wazuh-Docker starting point.
LICENSE (475 lines removed)

@@ -1,475 +0,0 @@

Portions Copyright (C) 2019 Wazuh, Inc.
Based on work Copyright (C) 2003 - 2013 Trend Micro, Inc.

This program is a free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License (version 2) as
published by the FSF - Free Software Foundation.

In addition, certain source files in this program permit linking with the
OpenSSL library (http://www.openssl.org), which otherwise wouldn't be allowed
under the GPL. For purposes of identifying OpenSSL, most source files giving
this permission limit it to versions of OpenSSL having a license identical to
that listed in this file (see section "OpenSSL LICENSE" below). It is not
necessary for the copyright years to match between this file and the OpenSSL
version in question. However, note that because this file is an extension of
the license statements of these source files, this file may not be changed
except with permission from all copyright holders of source files in this
program which reference this file.

Note that this license applies to the source code, as well as
decoders, rules and any other data file included with OSSEC (unless
otherwise specified).

For the purpose of this license, we consider an application to constitute a
"derivative work" or a work based on this program if it does any of the
following (list not exclusive):

* Integrates source code/data files from OSSEC.
* Includes OSSEC copyrighted material.
* Includes/integrates OSSEC into a proprietary executable installer.
* Links to a library or executes a program that does any of the above.

This list is not exclusive, but just a clarification of our interpretation
of derived works. These restrictions only apply if you actually redistribute
OSSEC (or parts of it).

We don't consider these to be added restrictions on top of the GPL,
but just a clarification of how we interpret "derived works" as it
applies to OSSEC. This is similar to the way Linus Torvalds has
announced his interpretation of how "derived works" applies to Linux kernel
modules. Our interpretation refers only to OSSEC - we don't speak
for any other GPL products.

* As a special exception, the copyright holders give
* permission to link the code of portions of this program with the
* OpenSSL library under certain conditions as described in each
* individual source file, and distribute linked combinations
* including the two.
* You must obey the GNU General Public License in all respects
* for all of the code used other than OpenSSL. If you modify
* file(s) with this exception, you may extend this exception to your
* version of the file(s), but you are not obligated to do so. If you
* do not wish to do so, delete this exception statement from your
* version. If you delete this exception statement from all source
* files in the program, then also delete it here.

OSSEC HIDS is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.
See the GNU General Public License Version 2 below for more details.

-----------------------------------------------------------------------------
GNU GENERAL PUBLIC LICENSE
Version 2, June 1991

Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Lesser General Public License instead.) You can apply it to
your programs, too.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.

To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.

For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.

We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.

Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.

Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.

The precise terms and conditions for copying, distribution and
modification follow.
GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION

0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".

Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.

1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.

You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.

2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:

a) You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.

b) You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.

c) If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)

These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.

Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.

In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:

a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,

b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,

c) Accompany it with the information you received as to the offer
to distribute corresponding source code. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)

The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.

If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.

4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.

5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.

6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.
7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.

If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.

It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.

This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.

8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.

9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.

10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.
NO WARRANTY

11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.

12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.

END OF TERMS AND CONDITIONS

-------------------------------------------------------------------------------
|
|
||||||
|
|
||||||
OpenSSL License
---------------

LICENSE ISSUES
==============

The OpenSSL toolkit stays under a dual license, i.e. both the conditions of
the OpenSSL License and the original SSLeay license apply to the toolkit.
See below for the actual license texts. Actually both licenses are BSD-style
Open Source licenses. In case of any license issues related to OpenSSL
please contact openssl-core@openssl.org.

OpenSSL License
---------------
/* ====================================================================
 * Copyright (c) 1998-2001 The OpenSSL Project. All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 *
 * 1. Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 *
 * 2. Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in
 *    the documentation and/or other materials provided with the
 *    distribution.
 *
 * 3. All advertising materials mentioning features or use of this
 *    software must display the following acknowledgment:
 *    "This product includes software developed by the OpenSSL Project
 *    for use in the OpenSSL Toolkit. (http://www.openssl.org/)"
 *
 * 4. The names "OpenSSL Toolkit" and "OpenSSL Project" must not be used to
 *    endorse or promote products derived from this software without
 *    prior written permission. For written permission, please contact
 *    openssl-core@openssl.org.
 *
 * 5. Products derived from this software may not be called "OpenSSL"
 *    nor may "OpenSSL" appear in their names without prior written
 *    permission of the OpenSSL Project.
 *
 * 6. Redistributions of any form whatsoever must retain the following
 *    acknowledgment:
 *    "This product includes software developed by the OpenSSL Project
 *    for use in the OpenSSL Toolkit (http://www.openssl.org/)"
 *
 * THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY
 * EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
 * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
 * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE OpenSSL PROJECT OR
 * ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
 * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
 * NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
 * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
 * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
 * STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
 * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
 * OF THE POSSIBILITY OF SUCH DAMAGE.
 * ====================================================================
 *
 * This product includes cryptographic software written by Eric Young
 * (eay@cryptsoft.com). This product includes software written by Tim
 * Hudson (tjh@cryptsoft.com).
 *
 */
Original SSLeay License
-----------------------

/* Copyright (C) 1995-1998 Eric Young (eay@cryptsoft.com)
 * All rights reserved.
 *
 * This package is an SSL implementation written
 * by Eric Young (eay@cryptsoft.com).
 * The implementation was written so as to conform with Netscapes SSL.
 *
 * This library is free for commercial and non-commercial use as long as
 * the following conditions are aheared to. The following conditions
 * apply to all code found in this distribution, be it the RC4, RSA,
 * lhash, DES, etc., code; not just the SSL code. The SSL documentation
 * included with this distribution is covered by the same copyright terms
 * except that the holder is Tim Hudson (tjh@cryptsoft.com).
 *
 * Copyright remains Eric Young's, and as such any Copyright notices in
 * the code are not to be removed.
 * If this package is used in a product, Eric Young should be given attribution
 * as the author of the parts of the library used.
 * This can be in the form of a textual message at program startup or
 * in documentation (online or textual) provided with the package.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 * 1. Redistributions of source code must retain the copyright
 *    notice, this list of conditions and the following disclaimer.
 * 2. Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in the
 *    documentation and/or other materials provided with the distribution.
 * 3. All advertising materials mentioning features or use of this software
 *    must display the following acknowledgement:
 *    "This product includes cryptographic software written by
 *    Eric Young (eay@cryptsoft.com)"
 *    The word 'cryptographic' can be left out if the routines from the library
 *    being used are not cryptographic related :-).
 * 4. If you include any Windows specific code (or a derivative thereof) from
 *    the apps directory (application code) you must include an acknowledgement:
 *    "This product includes software written by Tim Hudson (tjh@cryptsoft.com)"
 *
 * THIS SOFTWARE IS PROVIDED BY ERIC YOUNG ``AS IS'' AND
 * ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
 * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
 * ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE
 * FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
 * DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
 * OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
 * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
 * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
 * OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
 * SUCH DAMAGE.
 *
 * The licence and distribution terms for any publically available version or
 * derivative of this code cannot be changed. i.e. this code cannot simply be
 * copied and put under another distribution licence
 * [including the GNU Public Licence.]
 */
README.md (77 changed lines)
@@ -1,76 +1,21 @@
-# Wazuh containers for Docker
-
-[](https://wazuh.com/community/join-us-on-slack/)
-[](https://groups.google.com/forum/#!forum/wazuh)
-[](https://documentation.wazuh.com)
-[](https://wazuh.com)
-
-In this repository you will find the containers to run:
-
-* wazuh: It runs the Wazuh manager, Wazuh API and Filebeat (for integration with Elastic Stack)
-* wazuh-kibana: Provides a web user interface to browse through alerts data. It includes the Wazuh plugin for Kibana, which allows you to visualize agents' configuration and status.
-* wazuh-nginx: Proxies the Kibana container, adding HTTPS (via self-signed SSL certificate) and [Basic authentication](https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication#Basic_authentication_scheme).
-* wazuh-elasticsearch: An Elasticsearch container (working as a single-node cluster) using Elastic Stack Docker images. **Be sure to increase the `vm.max_map_count` setting, as detailed in the [Wazuh documentation](https://documentation.wazuh.com/current/docker/wazuh-container.html#increase-max-map-count-on-your-host-linux).**
-
-In addition, a docker-compose file is provided to launch the containers mentioned above.
-
-* Elasticsearch cluster. In the Elasticsearch Dockerfile we can see the variables used to configure an Elasticsearch cluster. These variables are used in the file *config_cluster.sh* to set them in the *elasticsearch.yml* configuration file. You can see the meaning of the node variables [here](https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-node.html) and other cluster settings [here](https://github.com/elastic/elasticsearch/blob/master/distribution/src/config/elasticsearch.yml).
-
-## Documentation
-
-* [Wazuh full documentation](http://documentation.wazuh.com)
-* [Wazuh documentation for Docker](https://documentation.wazuh.com/current/docker/index.html)
-* [Docker hub](https://hub.docker.com/u/wazuh)
-
-## Directory structure
-
-	wazuh-docker
-	├── docker-compose.yml
-	├── LICENSE
-	├── README.md
-	├── CHANGELOG.md
-	├── VERSION
-	├── test.txt
-	└── wazuh
-	    ├── config
-	    │   ├── 00-decrypt_credentials.sh
-	    │   ├── 01-wazuh.sh
-	    │   ├── 02-set_filebeat_destination.sh
-	    │   ├── 03-config_filebeat.sh
-	    │   ├── 20-ossec-configuration.sh
-	    │   ├── 25-backups.sh
-	    │   ├── 35-remove_credentials_file.sh
-	    │   ├── 85-save_wazuh_version.sh
-	    │   ├── create_user.py
-	    │   ├── entrypoint.sh
-	    │   ├── filebeat_to_elasticsearch.yml
-	    │   ├── filebeat_to_logstash.yml
-	    │   ├── filebeat.runit.service
-	    │   ├── permanent_data.env
-	    │   ├── postfix.runit.service
-	    │   └── wazuh.runit.service
-	    └── Dockerfile
-
-## Branches
-
-* `stable` branch corresponds to the latest Wazuh-Docker stable version.
-* `master` branch contains the latest code; be aware of possible bugs on this branch.
-* `Wazuh.Version_ElasticStack.Version` (for example 3.10.2_7.3.2) branch. This branch contains the current release referenced in Docker Hub. The container images are installed under the current version of this branch.
-
-## Credits and Thank you
-
-These Docker containers are based on:
-
-* "deviantony" dockerfiles, which can be found at [https://github.com/deviantony/docker-elk](https://github.com/deviantony/docker-elk)
-* "xetus-oss" dockerfiles, which can be found at [https://github.com/xetus-oss/docker-ossec-server](https://github.com/xetus-oss/docker-ossec-server)
-
-We thank them and everyone else who has contributed to this project.
-
-## License and copyright
-
-Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)
-
-## Web references
-
-[Wazuh website](http://wazuh.com)
+# IMPORTANT NOTE
+
+The first time you run this container it can take a while for Kibana to finish its configuration, and the Wazuh plugin can take a few minutes to finish installing, so please be patient.
+
+# Docker container Wazuh 2.0 + ELK (5.4.2)
+
+The source files for this Docker container can be found in our [Wazuh Github repository](https://github.com/wazuh/wazuh). It includes both an OSSEC manager and an Elasticsearch single-node cluster, with Logstash and Kibana. You can find more information on how these components work together in our documentation.
+
+## Documentation
+
+* [Full documentation](http://documentation.wazuh.com)
+* [Wazuh-docker module documentation](https://documentation.wazuh.com/current/docker/index.html)
+* [Hub docker](https://hub.docker.com/u/wazuh)
+
+## Credits and thank you
+
+These Docker containers are based on "deviantony" dockerfiles, which can be found at [https://github.com/deviantony/docker-elk](https://github.com/deviantony/docker-elk), and "xetus-oss" dockerfiles, which can be found at [https://github.com/xetus-oss/docker-ossec-server](https://github.com/xetus-oss/docker-ossec-server). We created our own fork, which we test and maintain. Thank you Anthony Lapenna for your contribution to the community.
+
+## References
+
+* [Wazuh website](http://wazuh.com)
docker-compose.yml
@@ -1,92 +1,71 @@
-# Wazuh App Copyright (C) 2019 Wazuh Inc. (License GPLv2)
 version: '2'

 services:
   wazuh:
-    image: wazuh/wazuh:3.10.2_7.3.2
+    image: wazuh/wazuh
     hostname: wazuh-manager
     restart: always
     ports:
-      - "1514:1514/udp"
+      - "1514/udp:1514/udp"
       - "1515:1515"
-      - "514:514/udp"
+      - "514/udp:514/udp"
       - "55000:55000"
-#    depends_on:
-#      - logstash
-#  logstash:
-#    image: wazuh/wazuh-elasticsearch:3.10.2_7.3.2
-#    hostname: logstash
-#    restart: always
-#    links:
-#      - elasticsearch:elasticsearch
-#    ports:
-#      - "5000:5000"
-#    depends_on:
-#      - elasticsearch
-#    environment:
-#      - LS_HEAP_SIZE=2048m
-#      - SECURITY_ENABLED=no
-#      - SECURITY_LOGSTASH_USER=service_logstash
-#      - SECURITY_LOGSTASH_PASS=logstash_pass
-#      - LOGSTASH_OUTPUT=https://elasticsearch:9200
-#      - ELASTICSEARCH_URL=https://elasticsearch:9200
-#      - SECURITY_CA_PEM=server.TEST-CA-signed.pem
-  elasticsearch:
-    image: wazuh/wazuh-elasticsearch:3.10.2_7.3.2
-    hostname: elasticsearch
-    restart: always
-    ports:
-      - "9200:9200"
-    environment:
-      - "ES_JAVA_OPTS=-Xms1g -Xmx1g"
-      - ELASTICSEARCH_PROTOCOL=http
-      - ELASTICSEARCH_IP=elasticsearch
-      - ELASTICSEARCH_PORT=9200
-      - SECURITY_ENABLED=no
-      - SECURITY_ELASTIC_PASSWORD=elastic_pass
-      - SECURITY_MAIN_NODE=elasticsearch
-      - ELASTIC_CLUSTER=true
-      - CLUSTER_NODE_MASTER=true
-      - CLUSTER_MASTER_NODE_NAME=elasticsearch
-      - CLUSTER_NODE_DATA=true
-      - CLUSTER_NODE_INGEST=true
-      - CLUSTER_MAX_NODES=3
-    ulimits:
-      memlock:
-        soft: -1
-        hard: -1
-    mem_limit: 2g
-  kibana:
-    image: wazuh/wazuh-kibana:3.10.2_7.3.2
-    hostname: kibana
-    restart: always
-    depends_on:
-      - elasticsearch
-    links:
-      - elasticsearch:elasticsearch
-      - wazuh:wazuh
-    environment:
-      - ELASTICSEARCH_URL=https://elasticsearch:9200
-      - SECURITY_ENABLED=no
-      - SECURITY_KIBANA_USER=service_kibana
-      - SECURITY_KIBANA_PASS=kibana_pass
-      - ELASTICSEARCH_KIBANA_IP=https://elasticsearch:9200
-      - SECURITY_CA_PEM=server.TEST-CA-signed.pem
-    ports:
-      - "5601:5601"
-  nginx:
-    image: wazuh/wazuh-nginx:3.10.2_7.3.2
-    hostname: nginx
-    restart: always
-    environment:
-      - NGINX_PORT=443
-      - NGINX_CREDENTIALS
-    ports:
-      - "80:80"
-      - "443:443"
-    depends_on:
-      - kibana
-    links:
-      - kibana:kibana
+    networks:
+      - docker_elk
+#    volumes:
+#      - my-path:/var/ossec/data
+#      - my-path:/etc/postfix
+  logstash:
+    image: wazuh/wazuh-logstash
+    hostname: logstash
+    restart: always
+    command: -f /etc/logstash/conf.d/
+#    volumes:
+#      - my-path:/etc/logstash/conf.d
+    links:
+      - kibana
+      - elasticsearch
+    ports:
+      - "5000:5000"
+    networks:
+      - docker_elk
+    depends_on:
+      - elasticsearch
+    environment:
+      - LS_HEAP_SIZE=2048m
+  elasticsearch:
+    image: elasticsearch:5.4.2
+    hostname: elasticsearch
+    restart: always
+    command: elasticsearch -E node.name="node-1" -E cluster.name="wazuh" -E network.host=0.0.0.0
+    ports:
+      - "9200:9200"
+      - "9300:9300"
+    environment:
+      ES_JAVA_OPTS: "-Xms2g -Xmx2g"
+#    volumes:
+#      - my-path:/usr/share/elasticsearch/data
+    networks:
+      - docker_elk
+  kibana:
+    image: wazuh/wazuh-kibana
+    hostname: kibana
+    restart: always
+    ports:
+      - "5601:5601"
+    networks:
+      - docker_elk
+    depends_on:
+      - elasticsearch
+    entrypoint: sh wait-for-it.sh elasticsearch
+#    environment:
+#      - "WAZUH_KIBANA_PLUGIN_URL=http://your.repo/wazuhapp-2.0_5.4.2.zip"
+
+networks:
+  docker_elk:
+    driver: bridge
+    ipam:
+      config:
+        - subnet: 172.25.0.0/24
images/image-1.png (new binary file, 81 KiB)
images/image-2.png (new binary file, 86 KiB)
kibana/Dockerfile (new file, 7 lines)

FROM kibana:5.4.2

RUN apt-get update && apt-get install -y curl

COPY ./config/kibana.yml /opt/kibana/config/kibana.yml

COPY config/wait-for-it.sh /
kibana/config/kibana.yml (new file, 92 lines)

# Kibana is served by a back end server. This setting specifies the port to use.
server.port: 5601

# This setting specifies the IP address of the back end server.
server.host: "0.0.0.0"

# Enables you to specify a path to mount Kibana at if you are running behind a proxy. This setting
# cannot end in a slash.
# server.basePath: ""

# The maximum payload size in bytes for incoming server requests.
# server.maxPayloadBytes: 1048576

# The Kibana server's name. This is used for display purposes.
# server.name: "your-hostname"

# The URL of the Elasticsearch instance to use for all your queries.
elasticsearch.url: "http://elasticsearch:9200"

# When this setting's value is true Kibana uses the hostname specified in the server.host
# setting. When the value of this setting is false, Kibana uses the hostname of the host
# that connects to this Kibana instance.
# elasticsearch.preserveHost: true

# Kibana uses an index in Elasticsearch to store saved searches, visualizations and
# dashboards. Kibana creates a new index if the index doesn't already exist.
# kibana.index: ".kibana"

# The default application to load.
# kibana.defaultAppId: "discover"

# If your Elasticsearch is protected with basic authentication, these settings provide
# the username and password that the Kibana server uses to perform maintenance on the Kibana
# index at startup. Your Kibana users still need to authenticate with Elasticsearch, which
# is proxied through the Kibana server.
# elasticsearch.username: "user"
# elasticsearch.password: "pass"

# Paths to the PEM-format SSL certificate and SSL key files, respectively. These
# files enable SSL for outgoing requests from the Kibana server to the browser.
# server.ssl.cert: /path/to/your/server.crt
# server.ssl.key: /path/to/your/server.key

# Optional settings that provide the paths to the PEM-format SSL certificate and key files.
# These files validate that your Elasticsearch backend uses the same key files.
# elasticsearch.ssl.cert: /path/to/your/client.crt
# elasticsearch.ssl.key: /path/to/your/client.key

# Optional setting that enables you to specify a path to the PEM file for the certificate
# authority for your Elasticsearch instance.
# elasticsearch.ssl.ca: /path/to/your/CA.pem

# To disregard the validity of SSL certificates, change this setting's value to false.
# elasticsearch.ssl.verify: true

# Time in milliseconds to wait for Elasticsearch to respond to pings. Defaults to the value of
# the elasticsearch.requestTimeout setting.
# elasticsearch.pingTimeout: 1500

# Time in milliseconds to wait for responses from the back end or Elasticsearch. This value
# must be a positive integer.
# elasticsearch.requestTimeout: 30000

# List of Kibana client-side headers to send to Elasticsearch. To send *no* client-side
# headers, set this value to [] (an empty list).
# elasticsearch.requestHeadersWhitelist: [ authorization ]

# Time in milliseconds for Elasticsearch to wait for responses from shards. Set to 0 to disable.
# elasticsearch.shardTimeout: 0

# Time in milliseconds to wait for Elasticsearch at Kibana startup before retrying.
# elasticsearch.startupTimeout: 5000

# Specifies the path where Kibana creates the process ID file.
# pid.file: /var/run/kibana.pid

# Enables you specify a file where Kibana stores log output.
# logging.dest: stdout

# Set the value of this setting to true to suppress all logging output.
# logging.silent: false

# Set the value of this setting to true to suppress all logging output other than error messages.
# logging.quiet: false

# Set the value of this setting to true to log all events, including system usage information
# and all requests.
# logging.verbose: false

# Set the interval in milliseconds to sample system and process performance
# metrics. Minimum is 100ms. Defaults to 10000.
# ops.interval: 10000
kibana/config/wait-for-it.sh (new file, 25 lines)

#!/bin/bash

set -e

host="$1"
shift
cmd="kibana"
WAZUH_KIBANA_PLUGIN_URL=${WAZUH_KIBANA_PLUGIN_URL:-https://packages.wazuh.com/wazuhapp/wazuhapp-2.0_5.4.2.zip}

until curl -XGET $host:9200; do
  >&2 echo "Elastic is unavailable - sleeping"
  sleep 1
done

sleep 30

>&2 echo "Elastic is up - executing command"

if /usr/share/kibana/bin/kibana-plugin list | grep wazuh; then
  echo "Wazuh APP already installed"
else
  /usr/share/kibana/bin/kibana-plugin install ${WAZUH_KIBANA_PLUGIN_URL}
fi

exec $cmd
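The wait-for-it.sh script above is a poll-until-ready loop: it curls Elasticsearch until it answers, then installs the Wazuh Kibana plugin only if it is missing. As a minimal sketch, the same retry pattern can be written in Python; `probe` here is a hypothetical stand-in for the `curl -XGET $host:9200` check, not part of the Wazuh images:

```python
import time

def wait_for(probe, retries=30, delay=1.0):
    """Call `probe` until it returns True, mirroring the `until curl` loop.

    Returns the attempt number that succeeded, or raises after `retries`
    failed attempts (the shell script instead loops forever).
    """
    for attempt in range(1, retries + 1):
        if probe():
            return attempt
        time.sleep(delay)  # the shell script sleeps 1 second between tries
    raise TimeoutError("service never became available")
```

With a probe that fails twice before succeeding, `wait_for` returns on the third attempt, just as the shell loop would run `curl` three times.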
logstash/Dockerfile (new file, 12 lines)

FROM logstash:5.4.2

RUN apt-get update

COPY config/logstash.conf /etc/logstash/conf.d/logstash.conf
COPY config/wazuh-elastic5-template.json /etc/logstash/wazuh-elastic5-template.json


ADD config/run.sh /tmp/run.sh
RUN chmod 755 /tmp/run.sh

ENTRYPOINT ["/tmp/run.sh"]
logstash/config/logstash.conf (new file, 43 lines)

# Wazuh - Logstash configuration file
## Remote Wazuh Manager - Filebeat input
input {
  beats {
    port => 5000
    codec => "json_lines"
#    ssl => true
#    ssl_certificate => "/etc/logstash/logstash.crt"
#    ssl_key => "/etc/logstash/logstash.key"
  }
}
## Local Wazuh Manager - JSON file input
#input {
#  file {
#    type => "wazuh-alerts"
#    path => "/var/ossec/logs/alerts/alerts.json"
#    codec => "json"
#  }
#}
filter {
  geoip {
    source => "srcip"
    target => "GeoLocation"
    fields => ["city_name", "continent_code", "country_code2", "country_name", "region_name", "location"]
  }
  date {
    match => ["timestamp", "ISO8601"]
    target => "@timestamp"
  }
  mutate {
    remove_field => [ "timestamp", "beat", "fields", "input_type", "tags", "count", "@version", "log", "offset", "type"]
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "wazuh-alerts-%{+YYYY.MM.dd}"
    document_type => "wazuh"
    template => "/etc/logstash/wazuh-elastic5-template.json"
    template_name => "wazuh"
    template_overwrite => true
  }
}
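The filter section above trims and enriches each alert before indexing: `geoip` derives a GeoLocation object from `srcip`, `date` copies the ISO8601 `timestamp` into `@timestamp`, and `mutate` drops transport metadata. A rough sketch of the date and mutate steps, assuming a simplified alert dict (the sample event shape is illustrative; real Wazuh alerts carry many more fields):

```python
from datetime import datetime, timezone

# Field list copied from the mutate filter's remove_field setting
REMOVED = ["timestamp", "beat", "fields", "input_type", "tags",
           "count", "@version", "log", "offset", "type"]

def apply_filters(event):
    """Sketch of the date + mutate stages for one alert dict."""
    out = dict(event)
    # date filter: parse the ISO8601 `timestamp` into `@timestamp` (UTC)
    if "timestamp" in out:
        out["@timestamp"] = datetime.fromisoformat(
            out["timestamp"]).astimezone(timezone.utc).isoformat()
    # mutate filter: drop transport metadata before indexing
    for field in REMOVED:
        out.pop(field, None)
    return out
```

Note the ordering matters, as it does in the Logstash pipeline: `@timestamp` must be derived before `timestamp` itself is removed.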
logstash/config/run.sh (new file, 31 lines)

#!/bin/bash

#
# OSSEC container bootstrap. See the README for information on the environment
# variables expected by this script.
#

#
# Apply Templates
#

set -e
host="elasticsearch"
until curl -XGET $host:9200; do
  >&2 echo "Elastic is unavailable - sleeping"
  sleep 1
done

# Add logstash as command if needed
if [ "${1:0:1}" = '-' ]; then
  set -- logstash "$@"
fi

# Run as user "logstash" if the command is "logstash"
if [ "$1" = 'logstash' ]; then
  set -- gosu logstash "$@"
fi

exec "$@"
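The argument handling at the end of run.sh follows the usual official-image entrypoint convention: if the first argument is a flag, the default `logstash` command is prepended; if the resulting command is `logstash`, it is re-run under `gosu` as the unprivileged logstash user. The same dispatch logic, sketched in Python for clarity (`build_command` is an illustrative helper, not part of the image):

```python
def build_command(args):
    """Mirror the `set --` rewrites in run.sh and return the final argv."""
    argv = list(args)
    # "Add logstash as command if needed": a leading flag implies logstash
    if argv and argv[0].startswith("-"):
        argv = ["logstash"] + argv
    # Run as user "logstash" if the command is "logstash"
    if argv and argv[0] == "logstash":
        argv = ["gosu", "logstash"] + argv
    return argv
```

So the compose file's `command: -f /etc/logstash/conf.d/` ends up executed as `gosu logstash logstash -f /etc/logstash/conf.d/`, while an explicit `bash` passes through untouched.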
logstash/config/wazuh-elastic5-template.json (new file, 620 lines; listing truncated below)

{
  "order": 0,
  "template": "wazuh*",
  "settings": {
    "index.refresh_interval": "5s"
  },
  "mappings": {
    "wazuh": {
      "dynamic_templates": [
        {
          "string_as_keyword": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "keyword",
              "doc_values": "true"
            }
          }
        }
      ],
      "properties": {
        "@timestamp": {
          "type": "date",
          "format": "dateOptionalTime"
        },
        "@version": {
          "type": "text"
        },
        "agent": {
          "properties": {
            "ip": { "type": "keyword", "doc_values": "true" },
            "id": { "type": "keyword", "doc_values": "true" },
            "name": { "type": "keyword", "doc_values": "true" }
          }
        },
        "manager": {
          "properties": {
            "name": { "type": "keyword", "doc_values": "true" }
          }
        },
        "dstuser": { "type": "keyword", "doc_values": "true" },
        "AlertsFile": { "type": "keyword", "doc_values": "true" },
        "full_log": { "type": "text" },
        "previous_log": { "type": "text" },
        "GeoLocation": {
          "properties": {
            "area_code": { "type": "long" },
            "city_name": { "type": "keyword", "doc_values": "true" },
            "continent_code": { "type": "text" },
            "coordinates": { "type": "double" },
            "country_code2": { "type": "text" },
            "country_code3": { "type": "text" },
            "country_name": { "type": "keyword", "doc_values": "true" },
            "dma_code": { "type": "long" },
            "ip": { "type": "keyword", "doc_values": "true" },
            "latitude": { "type": "double" },
            "location": { "type": "geo_point" },
            "longitude": { "type": "double" },
            "postal_code": { "type": "keyword" },
            "real_region_name": { "type": "keyword", "doc_values": "true" },
            "region_name": { "type": "keyword", "doc_values": "true" },
            "timezone": { "type": "text" }
          }
        },
        "host": { "type": "keyword", "doc_values": "true" },
        "syscheck": {
          "properties": {
            "path": { "type": "keyword", "doc_values": "true" },
            "sha1_before": { "type": "keyword", "doc_values": "true" },
            "sha1_after": { "type": "keyword", "doc_values": "true" },
            "uid_before": { "type": "keyword", "doc_values": "true" },
            "uid_after": { "type": "keyword", "doc_values": "true" },
            "gid_before": { "type": "keyword", "doc_values": "true" },
            "gid_after": { "type": "keyword", "doc_values": "true" },
            "perm_before": { "type": "keyword", "doc_values": "true" },
            "perm_after": { "type": "keyword", "doc_values": "true" },
            "md5_after": { "type": "keyword", "doc_values": "true" },
            "md5_before": { "type": "keyword", "doc_values": "true" },
            "gname_after": { "type": "keyword", "doc_values": "true" },
            "gname_before": { "type": "keyword", "doc_values": "true" },
            "inode_after": { "type": "keyword", "doc_values": "true" },
            "inode_before": { "type": "keyword", "doc_values": "true" },
            "mtime_after": { "type": "date", "format": "dateOptionalTime", "doc_values": "true" },
            "mtime_before": { "type": "date", "format": "dateOptionalTime", "doc_values": "true" },
            "uname_after": { "type": "keyword", "doc_values": "true" },
            "uname_before": { "type": "keyword", "doc_values": "true" },
            "size_before": { "type": "long", "doc_values": "true" },
            "size_after": { "type": "long", "doc_values": "true" },
            "diff": { "type": "keyword", "doc_values": "true" },
            "event": { "type": "keyword", "doc_values": "true" }
          }
        },
        "location": { "type": "keyword", "doc_values": "true" },
        "message": { "type": "text" },
        "offset": { "type": "keyword" },
        "rule": {
          "properties": {
            "description": { "type": "keyword", "doc_values": "true" },
            "groups": { "type": "keyword", "doc_values": "true" },
            "level": { "type": "long", "doc_values": "true" },
            "id": { "type": "keyword", "doc_values": "true" },
            "cve": { "type": "keyword", "doc_values": "true" },
            "info": { "type": "keyword", "doc_values": "true" },
            "frequency": { "type": "long", "doc_values": "true" },
            "firedtimes": { "type": "long", "doc_values": "true" },
            "cis": { "type": "keyword", "doc_values": "true" },
            "pci_dss": { "type": "keyword", "doc_values": "true" }
          }
        },
        "decoder": {
          "properties": {
            "parent": { "type": "keyword", "doc_values": "true" },
            "name": { "type": "keyword", "doc_values": "true" },
            "ftscomment": { "type": "keyword", "doc_values": "true" }
|
||||||
|
},
|
||||||
|
"fts": {
|
||||||
|
"type": "long",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"accumulate": {
|
||||||
|
"type": "long",
|
||||||
|
"doc_values": "true"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"srcip": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"protocol": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"action": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"dstip": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"dstport": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"srcuser": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"program_name": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"id": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"status": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"command": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"url": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"data": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"system_name": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"type": {
|
||||||
|
"type": "text"
|
||||||
|
},
|
||||||
|
"title": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"oscap": {
|
||||||
|
"properties": {
|
||||||
|
"check.title": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"check.id": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"check.result": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"check.severity": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"check.description": {
|
||||||
|
"type": "text"
|
||||||
|
},
|
||||||
|
"check.rationale": {
|
||||||
|
"type": "text"
|
||||||
|
},
|
||||||
|
"check.references": {
|
||||||
|
"type": "text"
|
||||||
|
},
|
||||||
|
"check.identifiers": {
|
||||||
|
"type": "text"
|
||||||
|
},
|
||||||
|
"check.oval.id": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"scan.id": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"scan.content": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"scan.benchmark.id": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"scan.profile.title": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"scan.profile.id": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"scan.score": {
|
||||||
|
"type": "double",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"scan.return_code": {
|
||||||
|
"type": "long",
|
||||||
|
"doc_values": "true"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"audit": {
|
||||||
|
"properties": {
|
||||||
|
"type": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"id": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"syscall": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"exit": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"ppid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"pid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"auid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"uid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"gid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"euid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"suid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"fsuid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"egid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"sgid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"fsgid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"tty": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"session": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"command": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"exe": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"key": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"cwd": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"directory.name": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"directory.inode": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"directory.mode": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"file.name": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"file.inode": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"file.mode": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"acct": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"dev": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"enforcing": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"list": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"old-auid": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"old-ses": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"old_enforcing": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"old_prom": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"op": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"prom": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"res": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"srcip": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"subj": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
},
|
||||||
|
"success": {
|
||||||
|
"type": "keyword",
|
||||||
|
"doc_values": "true"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"agent": {
|
||||||
|
"properties": {
|
||||||
|
"@timestamp": {
|
||||||
|
"type": "date",
|
||||||
|
"format": "dateOptionalTime"
|
||||||
|
},
|
||||||
|
"status": {
|
||||||
|
"type": "keyword"
|
||||||
|
},
|
||||||
|
"ip": {
|
||||||
|
"type": "keyword"
|
||||||
|
},
|
||||||
|
"host": {
|
||||||
|
"type": "keyword"
|
||||||
|
},
|
||||||
|
"name": {
|
||||||
|
"type": "keyword"
|
||||||
|
},
|
||||||
|
"id": {
|
||||||
|
"type": "keyword"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
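Nearly every alert field in the template fragment above is mapped as keyword with doc_values enabled, alongside a handful of long, date, and text fields. A minimal, self-contained sketch (the file name and the three fields below are invented, not part of the real template) of tallying the declared types in such a fragment before loading it into Elasticsearch:

```shell
# Hypothetical fragment in the same shape as the mapping above;
# count which field types it declares.
cat > /tmp/mapping-fragment.json <<'EOF'
{ "properties": {
    "srcip":      { "type": "keyword", "doc_values": "true" },
    "dstport":    { "type": "keyword", "doc_values": "true" },
    "size_after": { "type": "long",    "doc_values": "true" }
} }
EOF

# grep -c counts matching lines, one field declaration per line here.
keywords=$(grep -c '"type": "keyword"' /tmp/mapping-fragment.json)
longs=$(grep -c '"type": "long"' /tmp/mapping-fragment.json)
echo "keyword fields: $keywords, long fields: $longs"
```

Fields typed as keyword are indexed verbatim and can be aggregated on, which is why the template prefers them over analyzed text for IPs, IDs, and hashes.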
126  wazuh/Dockerfile
@@ -1,115 +1,35 @@
 # Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)
-FROM centos:latest
+FROM waystonesystems/baseimage-centos:0.2.0

-# Arguments
-ARG FILEBEAT_VERSION=7.10.2
-ARG WAZUH_VERSION=4.7.2-0.debug
-
-# Environment variables
-ENV API_USER="foo" \
-    API_PASS="bar"
-
-ARG TEMPLATE_VERSION="4.0"
-ENV FILEBEAT_DESTINATION="elasticsearch"
+COPY config/*.repo /etc/yum.repos.d/

-# Install packages
-RUN set -x && \
-    groupadd -g 1000 wazuh && \
-    useradd -u 1000 -g 1000 -d /var/ossec wazuh && \
-    # Retrieve DEV package
-    #curl -o /tmp/wazuh-manager-$WAZUH_VERSION.x86_64.rpm https://packages-dev.wazuh.com/pre-release/yum/wazuh-manager-$WAZUH_VERSION.x86_64.rpm && \
-    # Retrieve PROD package
-    curl -o /tmp/wazuh-manager-$WAZUH_VERSION.x86_64.rpm https://packages.wazuh.com/cloud/4.7.x/rpm/wazuh-manager-$WAZUH_VERSION.x86_64.rpm && \
-    yum update -y && \
-    yum upgrade -y && \
-    yum install -y openssl vim expect python-boto python-pip python-cryptography postfix bsd-mailx mailx ca-certificates && \
-    yum localinstall -y /tmp/wazuh-manager-$WAZUH_VERSION.x86_64.rpm && \
-    rm -f /tmp/wazuh-manager-$WAZUH_VERSION.x86_64.rpm && \
-    yum clean all && \
-    rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* && \
-    rm -f /var/ossec/logs/alerts/*/*/* && \
-    rm -f /var/ossec/logs/archives/*/*/* && \
-    rm -f /var/ossec/logs/firewall/*/*/* && \
-    rm -f /var/ossec/logs/api/*/*/* && \
-    rm -f /var/ossec/logs/cluster/*/*/* && \
-    rm -f /var/ossec/logs/wazuh/*/*/* && \
-    curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-${FILEBEAT_VERSION}-x86_64.rpm && \
-    rpm -vi filebeat-${FILEBEAT_VERSION}-x86_64.rpm && rm -f filebeat-${FILEBEAT_VERSION}-x86_64.rpm
+RUN yum -y update; yum clean all;
+RUN yum -y install epel-release openssl useradd; yum clean all
+RUN yum -y install postfix mailx cyrus-sasl cyrus-sasl-plain; yum clean all
+RUN groupadd -g 1000 ossec
+RUN useradd -u 1000 -g 1000 ossec
+RUN yum install -y wazuh-manager wazuh-api

-# Services
-RUN mkdir /etc/service/wazuh && \
-    mkdir /etc/service/postfix && \
-    mkdir /etc/service/filebeat
-COPY config/wazuh.runit.service /etc/service/wazuh/run
-COPY config/postfix.runit.service /etc/service/postfix/run
-COPY config/filebeat.runit.service /etc/service/filebeat/run
-RUN chmod +x /etc/service/wazuh/run && \
-    chmod +x /etc/service/postfix/run && \
-    chmod +x /etc/service/filebeat/run
-
-# Copy configuration files from repository
-COPY config/filebeat_to_elasticsearch.yml ./
-COPY config/filebeat_to_logstash.yml ./
+ADD config/data_dirs.env /data_dirs.env
+ADD config/init.bash /init.bash

-# Prepare permanent data
 # Sync calls are due to https://github.com/docker/docker/issues/9547
-COPY config/permanent_data.env /permanent_data.env
-COPY config/permanent_data.sh /permanent_data.sh
-RUN chmod 755 /permanent_data.sh && \
-    sync && \
-    /permanent_data.sh && \
-    sync && \
-    rm /permanent_data.sh
+RUN chmod 755 /init.bash && \
+    sync && /init.bash && \
+    sync && rm /init.bash

-# Setting volumes
-# Once we declare a volume in the Dockerfile, changes made to that path will have no effect. In other words, any changes
-# made to these paths from here to the end of the Dockerfile will not be taken into account when mounting the volume.
-VOLUME ["/var/ossec/api/configuration"]
-VOLUME ["/var/ossec/etc"]
-VOLUME ["/var/ossec/logs"]
-VOLUME ["/var/ossec/queue"]
-VOLUME ["/var/ossec/agentless"]
-VOLUME ["/var/ossec/var/multigroups"]
-VOLUME ["/var/ossec/integrations"]
-VOLUME ["/var/ossec/active-response/bin"]
-VOLUME ["/var/ossec/wodles"]
-VOLUME ["/etc/filebeat"]
-VOLUME ["/etc/postfix"]
-VOLUME ["/var/lib/filebeat"]
+RUN curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-5.4.2-x86_64.rpm && \
+    rpm -vi filebeat-5.4.2-x86_64.rpm && rm filebeat-5.4.2-x86_64.rpm

-# Prepare entrypoint scripts
-# Entrypoint scripts must be added to the entrypoint-scripts directory
-RUN mkdir /entrypoint-scripts
-COPY config/entrypoint.sh /entrypoint.sh
-COPY --chown=root:wazuh config/create_user.py /var/ossec/framework/scripts/create_user.py
-COPY config/00-decrypt_credentials.sh /entrypoint-scripts/00-decrypt_credentials.sh
-COPY config/01-wazuh.sh /entrypoint-scripts/01-wazuh.sh
-COPY config/02-set_filebeat_destination.sh /entrypoint-scripts/02-set_filebeat_destination.sh
-COPY config/03-config_filebeat.sh /entrypoint-scripts/03-config_filebeat.sh
-COPY config/20-ossec-configuration.sh /entrypoint-scripts/20-ossec-configuration.sh
-COPY config/25-backups.sh /entrypoint-scripts/25-backups.sh
-COPY config/35-remove_credentials_file.sh /entrypoint-scripts/35-remove_credentials_file.sh
-COPY config/85-save_wazuh_version.sh /entrypoint-scripts/85-save_wazuh_version.sh
+COPY config/filebeat.yml /etc/filebeat/

-RUN chmod 755 /entrypoint.sh && \
-    chmod 755 /entrypoint-scripts/00-decrypt_credentials.sh && \
-    chmod 755 /entrypoint-scripts/01-wazuh.sh && \
-    chmod 755 /entrypoint-scripts/02-set_filebeat_destination.sh && \
-    chmod 755 /entrypoint-scripts/03-config_filebeat.sh && \
-    chmod 755 /entrypoint-scripts/20-ossec-configuration.sh && \
-    chmod 755 /entrypoint-scripts/25-backups.sh && \
-    chmod 755 /entrypoint-scripts/35-remove_credentials_file.sh && \
-    chmod 755 /entrypoint-scripts/85-save_wazuh_version.sh
+ADD config/run.sh /tmp/run.sh
+RUN chmod 755 /tmp/run.sh

-# Load wazuh alerts template.
-#ADD https://raw.githubusercontent.com/wazuh/wazuh/$TEMPLATE_VERSION/extensions/elasticsearch/7.x/wazuh-template.json /etc/filebeat
-#RUN chmod go-w /etc/filebeat/wazuh-template.json
+VOLUME ["/var/ossec/data"]

-# Expose ports
-EXPOSE 55000/tcp 1514/udp 1515/tcp 514/udp 1516/tcp
+EXPOSE 55000/tcp 1514/udp 1515/tcp 514/udp

-# Run all services
-ENTRYPOINT ["/entrypoint.sh"]
+# Run supervisord so that the container will stay alive
+ENTRYPOINT ["/tmp/run.sh"]
@@ -1,15 +0,0 @@
#!/bin/bash
# Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)

##############################################################################
# Decrypt credentials.
# If the credentials of the API user to be created are encrypted,
# they must be decrypted for later use.
##############################################################################

if [[ "x${SECURITY_CREDENTIALS_FILE}" == "x" ]]; then
  echo "CREDENTIALS - Security credentials file not used. Nothing to do."
else
  echo "CREDENTIALS - TO DO"
fi
# TO DO
@@ -1,333 +0,0 @@
#!/bin/bash
# Wazuh App Copyright (C) 2019 Wazuh Inc. (License GPLv2)

# Variables
source /permanent_data.env

WAZUH_INSTALL_PATH=/var/ossec
WAZUH_CONFIG_MOUNT=/wazuh-config-mount
AUTO_ENROLLMENT_ENABLED=${AUTO_ENROLLMENT_ENABLED:-true}
API_GENERATE_CERTS=${API_GENERATE_CERTS:-true}

##############################################################################
# Aux functions
##############################################################################
print() {
  echo -e "$1"
}

error_and_exit() {
  echo "Error executing command: '$1'."
  echo 'Exiting.'
  exit 1
}

exec_cmd() {
  eval "$1" > /dev/null 2>&1 || error_and_exit "$1"
}

exec_cmd_stdout() {
  eval "$1" 2>&1 || error_and_exit "$1"
}
##############################################################################
# check_update
# This function considers the following cases:
# - If /var/ossec/etc/VERSION does not exist -> Action: Nothing. There is no data in the EBS; first time deploying Wazuh.
# - If different Wazuh version -> Action: Update. The previous version is older than the current one.
# - If the same Wazuh version -> Action: Nothing. Same Wazuh version.
##############################################################################

check_update() {
  if [ -e /var/ossec/etc/VERSION ]
  then
    previous_version=$(cat /var/ossec/etc/VERSION | grep -i version | cut -d'"' -f2)
    echo "CHECK UPDATE - Previous version: $previous_version"
    current_version=$(/var/ossec/bin/wazuh-control -j info | jq .data[0].WAZUH_VERSION | cut -d'"' -f2)
    echo "CHECK UPDATE - Current version: $current_version"
    if [ "$previous_version" == "$current_version" ]
    then
      echo "CHECK UPDATE - Same Wazuh version in the EBS and image"
      return 0
    else
      echo "CHECK UPDATE - Different Wazuh version: Update"
      wazuh_version_regex='v4.2.[0-9]'
      if [[ "$previous_version" =~ $wazuh_version_regex ]]
      then
        echo "CHECK UPDATE - Change ossec user to wazuh user"
        ossec_group_files=$(find /var/ossec -group 1000)
        ossec_user_files=$(find /var/ossec -user 1000)

        while IFS= read -r group; do
          chgrp wazuh $group
        done <<< "$ossec_group_files"

        while IFS= read -r user; do
          chown wazuh $user
        done <<< "$ossec_user_files"

        echo "CHECK UPDATE - Change ossecr user to wazuh user"
        ossecr_group_files=$(find /var/ossec -group 998)
        ossecr_user_files=$(find /var/ossec -user 998)

        while IFS= read -r group; do
          chgrp wazuh $group
        done <<< "$ossecr_group_files"

        while IFS= read -r user; do
          chown wazuh $user
        done <<< "$ossecr_user_files"

        echo "CHECK UPDATE - Change ossecm user to wazuh user"
        ossecm_group_files=$(find /var/ossec -group 997)
        ossecm_user_files=$(find /var/ossec -user 997)

        while IFS= read -r group; do
          chgrp wazuh $group
        done <<< "$ossecm_group_files"

        while IFS= read -r user; do
          chown wazuh $user
        done <<< "$ossecm_user_files"
      fi
      return 1
    fi
  else
    echo "CHECK UPDATE - First time mounting EBS"
    return 0
  fi
}
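The version check in check_update() boils down to extracting the quoted version string from /var/ossec/etc/VERSION and comparing it with the version reported by the running image. A standalone sketch of that comparison with invented version values and a throwaway file, so no Wazuh installation is required:

```shell
# Stand-in for /var/ossec/etc/VERSION; the real file carries a quoted
# version string on a line matching "version" (values here are made up).
echo 'WAZUH_VERSION="v4.2.5"' > /tmp/VERSION

# Same extraction as check_update(): take the text between the first quotes.
previous_version=$(cat /tmp/VERSION | grep -i version | cut -d'"' -f2)
current_version="v4.3.0"   # hypothetical version baked into the image

if [ "$previous_version" = "$current_version" ]; then
  action="nothing"   # data volume already matches the image
else
  action="update"    # versions differ, so migration steps run
fi
echo "previous=$previous_version current=$current_version action=$action"
```

The return code of check_update() (0 for "nothing", 1 for "update") is what main() later uses to decide whether stale database files should be removed.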
##############################################################################
# Edit configuration
##############################################################################

edit_configuration() { # $1 -> setting, $2 -> value
  sed -i "s/^config.$1\s=.*/config.$1 = \"$2\";/g" "${WAZUH_INSTALL_PATH}/api/configuration/config.js" || error_and_exit "sed (editing configuration)"
}

##############################################################################
# This function will attempt to mount every directory in PERMANENT_DATA
# into the respective path.
# If the path is empty, it means the permanent data volume is also empty, so
# a backup will be copied into it. Otherwise it will not be copied, because
# there is already data inside the volume for the specified path.
##############################################################################

mount_permanent_data() {
  for permanent_dir in "${PERMANENT_DATA[@]}"; do
    # Check if the path is not empty
    if find ${permanent_dir} -mindepth 1 | read; then
      print "The path ${permanent_dir} is already mounted"
    else
      print "Installing ${permanent_dir}"
      exec_cmd "cp -a ${WAZUH_INSTALL_PATH}/data_tmp/permanent${permanent_dir}/. ${permanent_dir}"
    fi
  done
}
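The find ... -mindepth 1 | read idiom above is a compact emptiness test: read succeeds only if find prints at least one entry. A self-contained sketch of the same seed-if-empty pattern, using temporary directories instead of the real /var/ossec paths:

```shell
# Seed a destination directory from packaged defaults only when it is empty,
# mirroring the mount_permanent_data() logic (all paths here are temporary).
src=$(mktemp -d)
dst=$(mktemp -d)
echo "default config" > "$src/ossec.conf"

# 'read' exits non-zero when find produces no output, i.e. $dst is empty.
if find "$dst" -mindepth 1 | read -r _; then
  state="already mounted"
else
  cp -a "$src/." "$dst/"   # copy contents, preserving attributes
  state="installed"
fi
echo "state=$state"
```

On a second run against the same populated directory the find test would succeed and the copy would be skipped, which is exactly why existing data on the permanent volume survives container re-creation.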
##############################################################################
# This function will replace from the permanent data volume every file
# contained in PERMANENT_DATA_EXCP.
# Some files, such as 'internal_options.conf', are saved as permanent data,
# but they must be updated to work properly if the Wazuh version is changed.
##############################################################################

apply_exclusion_data() {
  for exclusion_file in "${PERMANENT_DATA_EXCP[@]}"; do
    if [ -e ${WAZUH_INSTALL_PATH}/data_tmp/exclusion/${exclusion_file} ]
    then
      DIR=$(dirname "${exclusion_file}")
      if [ ! -e ${DIR} ]
      then
        mkdir -p ${DIR}
      fi

      print "Updating ${exclusion_file}"
      exec_cmd "cp -p ${WAZUH_INSTALL_PATH}/data_tmp/exclusion/${exclusion_file} ${exclusion_file}"
    fi
  done
}

##############################################################################
# This function will delete from the permanent data volume every file
# contained in PERMANENT_DATA_DEL.
##############################################################################

remove_data_files() {
  for del_file in "${PERMANENT_DATA_DEL[@]}"; do
    if [ $(ls ${del_file} 2> /dev/null | wc -l) -ne 0 ]
    then
      print "Removing ${del_file}"
      exec_cmd "rm ${del_file}"
    fi
  done
}

##############################################################################
# Create certificates: Manager
##############################################################################

create_ossec_key_cert() {
  print "Creating wazuh-authd key and cert"
  exec_cmd "openssl genrsa -out ${WAZUH_INSTALL_PATH}/etc/sslmanager.key 4096"
  exec_cmd "openssl req -new -x509 -key ${WAZUH_INSTALL_PATH}/etc/sslmanager.key -out ${WAZUH_INSTALL_PATH}/etc/sslmanager.cert -days 3650 -subj /CN=${HOSTNAME}/"
}

##############################################################################
# Copy all files from $WAZUH_CONFIG_MOUNT to $WAZUH_INSTALL_PATH and respect
# destination file permissions.
#
# For example, to mount the file /var/ossec/data/etc/ossec.conf, mount it at
# $WAZUH_CONFIG_MOUNT/etc/ossec.conf in your container and this code will
# replace the ossec.conf file in /var/ossec/data/etc with yours.
##############################################################################

mount_files() {
  if [ -e "$WAZUH_CONFIG_MOUNT" ]
  then
    print "Identified Wazuh configuration files to mount..."
    exec_cmd_stdout "cp --verbose -r $WAZUH_CONFIG_MOUNT/* $WAZUH_INSTALL_PATH"
  else
    print "No Wazuh configuration files to mount..."
  fi
}

##############################################################################
# Stop OSSEC
##############################################################################

function ossec_shutdown(){
  ${WAZUH_INSTALL_PATH}/bin/wazuh-control stop;
}

##############################################################################
# Interpret any passed arguments (via docker command to this entrypoint) as
# paths or commands, and execute them.
#
# This can be useful for actions that need to be run before the services are
# started, such as "/var/ossec/bin/wazuh-control enable agentless".
##############################################################################

docker_custom_args() {
  for CUSTOM_COMMAND in "$@"
  do
    echo "Executing command \`${CUSTOM_COMMAND}\`"
    exec_cmd_stdout "${CUSTOM_COMMAND}"
  done
}
##############################################################################
# Change Wazuh API user credentials.
##############################################################################

function_create_custom_user() {

  # Get custom credentials
  if [[ "x${SECURITY_CREDENTIALS_FILE}" == "x" ]]; then
    echo "No security credentials file used"
  else
    input=${SECURITY_CREDENTIALS_FILE}
    while IFS= read -r line
    do
      if [[ $line == *"WUI_API_PASS"* ]]; then
        arrIN=(${line//:/ })
        WUI_API_PASS=${arrIN[1]}
      elif [[ $line == *"WAZUH_API_PASS"* ]]; then
        arrIN=(${line//:/ })
        WAZUH_API_PASS=${arrIN[1]}
      fi
    done < "$input"
  fi

  if [[ ! -z $WAZUH_API_PASS ]]; then
    cat << EOF > "/var/ossec/api/configuration/wazuh-user.json"
{
  "password": "$WAZUH_API_PASS"
}
EOF
  fi

  if [[ ! -z $WUI_API_PASS ]]; then
    cat << EOF > "/var/ossec/api/configuration/wui-user.json"
{
  "password": "$WUI_API_PASS"
}
EOF

    # Create or customize the API user
    if /var/ossec/framework/python/bin/python3 /var/ossec/framework/scripts/create_user.py; then
      # Remove the JSON files if the exit code is 0
      echo "Wazuh API credentials changed"
      rm /var/ossec/api/configuration/wui-user.json
      rm /var/ossec/api/configuration/wazuh-user.json
    else
      echo "There was an error configuring the API users"
      sleep 10
      # Terminate the container to avoid unpredictable behavior
      kill -s SIGINT 1
    fi
  fi
}
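function_create_custom_user() pulls passwords out of NAME:password lines in the credentials file. A sketch of that parsing against a throwaway file, using POSIX parameter expansion (${line#*:}) in place of the script's bash-array split; the names and passwords below are invented:

```shell
# Fake credentials file in the NAME:password format the entrypoint expects.
creds=$(mktemp)
printf 'WAZUH_API_PASS:managersecret\nWUI_API_PASS:uisecret\n' > "$creds"

# Match each known key and keep everything after the first ':'.
while IFS= read -r line; do
  case "$line" in
    WUI_API_PASS:*)   WUI_API_PASS=${line#*:} ;;
    WAZUH_API_PASS:*) WAZUH_API_PASS=${line#*:} ;;
  esac
done < "$creds"

echo "api=$WAZUH_API_PASS wui=$WUI_API_PASS"
```

Note that the original split on ':' also breaks if a password itself contains a colon or whitespace; the suffix-strip form above keeps everything after the first separator.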
##############################################################################
# Main function
##############################################################################

main() {
  # Check the Wazuh version in the image and EBS (returns 1 when updating the environment)
  check_update
  update=$?

  # Mount permanent data (i.e. ossec.conf)
  mount_permanent_data

  # Restore files stored in permanent data that are not permanent (i.e. internal_options.conf)
  apply_exclusion_data

  # When updating the environment, remove some files in permanent_data (i.e. .template.db)
  if [ $update == 1 ]
  then
    echo "Removing databases"
    remove_data_files
  else
    echo "Keeping databases"
  fi

  # Generate wazuh-authd certs if AUTO_ENROLLMENT_ENABLED is true and they do not exist
  if [ $AUTO_ENROLLMENT_ENABLED == true ]
  then
    if [ ! -e ${WAZUH_INSTALL_PATH}/etc/sslmanager.key ]
    then
      create_ossec_key_cert
    fi
  fi

  # Mount selected files (WAZUH_CONFIG_MOUNT) to the container
  mount_files

  # Trap exit signals and do a proper shutdown
  trap "ossec_shutdown; exit" SIGINT SIGTERM

  # Execute custom args
  docker_custom_args "$@"

  # Change API user credentials
  if [[ ${CLUSTER_NODE_TYPE} == "master" ]]; then
    function_create_custom_user
  fi

  # Delete the temporary data folder
  rm -rf ${WAZUH_INSTALL_PATH}/data_tmp
}

main "$@"
@@ -1,30 +0,0 @@
#!/bin/bash
# Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)

##############################################################################
# Set Filebeat destination.
##############################################################################

if [[ $FILEBEAT_DESTINATION == "elasticsearch" ]]; then

  echo "FILEBEAT - Set destination to Elasticsearch"
  cp filebeat_to_elasticsearch.yml /etc/filebeat/filebeat.yml
  if [[ $FILEBEAT_OUTPUT != "" ]]; then
    sed -i "s/elasticsearch:9200/$FILEBEAT_OUTPUT:9200/" /etc/filebeat/filebeat.yml
  fi

elif [[ $FILEBEAT_DESTINATION == "logstash" ]]; then

  echo "FILEBEAT - Set destination to Logstash"
  cp filebeat_to_logstash.yml /etc/filebeat/filebeat.yml
  if [[ $FILEBEAT_OUTPUT != "" ]]; then
    sed -i "s/logstash:5000/$FILEBEAT_OUTPUT:5000/" /etc/filebeat/filebeat.yml
  fi

else
  echo "FILEBEAT - Error choosing destination. Using default filebeat.yml"
fi

echo "FILEBEAT - Set permissions"

chmod go-w /etc/filebeat/filebeat.yml
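The script above swaps the default `host:port` pair with a plain `sed` replace. A minimal dry run of that pattern against a throwaway file (the `es-node-1` host name and the single-line template are illustrative, not the real `filebeat.yml`):

```shell
# Dry run of the sed host substitution used above; the template line and
# the es-node-1 host are illustrative only.
tmp=$(mktemp)
echo 'hosts: ["elasticsearch:9200"]' > "$tmp"
FILEBEAT_OUTPUT="es-node-1"
sed -i "s/elasticsearch:9200/$FILEBEAT_OUTPUT:9200/" "$tmp"
cat "$tmp"    # hosts: ["es-node-1:9200"]
```

Note the port is hard-coded in the pattern, so `FILEBEAT_OUTPUT` is expected to carry only the host name.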
@@ -1,23 +0,0 @@
#!/bin/bash
# Wazuh App Copyright (C) 2019 Wazuh Inc. (License GPLv2)

set -e

if [[ $FILEBEAT_DESTINATION == "elasticsearch" ]]; then

  WAZUH_FILEBEAT_MODULE=wazuh-filebeat-0.1.tar.gz

  # Modify the output to Elasticsearch if ELASTICSEARCH_URL is set
  if [ "$ELASTICSEARCH_URL" != "" ]; then
    >&2 echo "FILEBEAT - Customize Elasticsearch output IP."
    sed -i 's|http://elasticsearch:9200|'$ELASTICSEARCH_URL'|g' /etc/filebeat/filebeat.yml
  fi

  # Install the Wazuh Filebeat module (create the target directory first so tar -C cannot fail)
  >&2 echo "FILEBEAT - Install Wazuh Filebeat module."
  mkdir -p /usr/share/filebeat/module/wazuh
  curl -s "https://packages.wazuh.com/3.x/filebeat/${WAZUH_FILEBEAT_MODULE}" | tar -xvz -C /usr/share/filebeat/module
  chmod 755 -R /usr/share/filebeat/module/wazuh

fi
@@ -1,13 +0,0 @@
#!/bin/bash
# Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)

##############################################################################
# Change Wazuh manager configuration.
##############################################################################

# # Example:
# # Change the remote protocol from udp to tcp
# PROTOCOL="tcp"
# sed -i -e '/<remote>/,/<\/remote>/ s|<protocol>udp</protocol>|<protocol>'$PROTOCOL'</protocol>|g' /var/ossec/etc/ossec.conf
# # It is necessary to restart the service in order to apply the new configuration.
# service wazuh-manager restart
@@ -1,10 +0,0 @@
#!/bin/bash
# Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)

##############################################################################
# Enable Wazuh backups and store them in a repository.
##############################################################################

# TO DO
echo "BACKUPS - TO DO"
@@ -1,14 +0,0 @@
#!/bin/bash
# Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)

##############################################################################
# Decrypt credentials.
# Remove the credentials file for security reasons.
##############################################################################

if [[ "x${SECURITY_CREDENTIALS_FILE}" == "x" ]]; then
  echo "CREDENTIALS - Security credentials file not used. Nothing to do."
else
  echo "CREDENTIALS - Remove credentials file."
  shred -zvu ${SECURITY_CREDENTIALS_FILE}
fi
@@ -1,6 +0,0 @@
#!/bin/bash
# Wazuh Docker Copyright (C) 2020 Wazuh Inc. (License GPLv2)

# Copy /var/ossec/etc/ossec-init.conf contents into /var/ossec/etc/VERSION to be able to check the previous Wazuh version in the pod.
echo "Adding Wazuh version to /var/ossec/etc/VERSION"
/var/ossec/bin/wazuh-control info > /var/ossec/etc/VERSION
@@ -1,63 +0,0 @@
|
|||||||
import logging
|
|
||||||
import sys
|
|
||||||
import json
|
|
||||||
import random
|
|
||||||
import string
|
|
||||||
import os
|
|
||||||
import re
|
|
||||||
# Set framework path
|
|
||||||
sys.path.append(os.path.dirname(sys.argv[0]) + "/../framework")
|
|
||||||
WUI_USER_FILE_PATH = "/var/ossec/api/configuration/wui-user.json"
|
|
||||||
WAZUH_USER_FILE_PATH = "/var/ossec/api/configuration/wazuh-user.json"
|
|
||||||
|
|
||||||
try:
|
|
||||||
from wazuh.rbac.orm import check_database_integrity
|
|
||||||
from wazuh.security import (
|
|
||||||
create_user,
|
|
||||||
get_users,
|
|
||||||
get_roles,
|
|
||||||
set_user_role,
|
|
||||||
update_user,
|
|
||||||
)
|
|
||||||
except Exception as e:
|
|
||||||
logging.error("No module 'wazuh' found.")
|
|
||||||
sys.exit(1)
|
|
||||||
|
|
||||||
def read_wui_user_file(path=WUI_USER_FILE_PATH):
|
|
||||||
with open(path) as wui_user_file:
|
|
||||||
data = json.load(wui_user_file)
|
|
||||||
return data["password"]
|
|
||||||
|
|
||||||
def read_wazuh_user_file(path=WAZUH_USER_FILE_PATH):
|
|
||||||
with open(path) as wazuh_user_file:
|
|
||||||
data = json.load(wazuh_user_file)
|
|
||||||
return data["password"]
|
|
||||||
|
|
||||||
def db_users():
|
|
||||||
users_result = get_users()
|
|
||||||
return {user["username"]: user["id"] for user in users_result.affected_items}
|
|
||||||
|
|
||||||
if __name__ == "__main__":
|
|
||||||
if not os.path.exists(WUI_USER_FILE_PATH):
|
|
||||||
# abort if no user file detected
|
|
||||||
sys.exit(0)
|
|
||||||
|
|
||||||
wui_password = read_wui_user_file()
|
|
||||||
wazuh_password = read_wazuh_user_file()
|
|
||||||
check_database_integrity()
|
|
||||||
initial_users = db_users()
|
|
||||||
|
|
||||||
# set a random password for all other users (not wazuh-wui)
|
|
||||||
for name, id in initial_users.items():
|
|
||||||
custom_pass = None
|
|
||||||
if name == "wazuh-wui":
|
|
||||||
custom_pass = wui_password
|
|
||||||
elif name == "wazuh":
|
|
||||||
custom_pass = wazuh_password
|
|
||||||
if custom_pass:
|
|
||||||
update_user(
|
|
||||||
user_id=[
|
|
||||||
str(id),
|
|
||||||
],
|
|
||||||
password=custom_pass,
|
|
||||||
)
|
|
||||||
wazuh/config/data_dirs.env (new file)
@@ -0,0 +1,9 @@
i=0
DATA_DIRS[((i++))]="etc"
DATA_DIRS[((i++))]="ruleset"
DATA_DIRS[((i++))]="logs"
DATA_DIRS[((i++))]="stats"
DATA_DIRS[((i++))]="queue"
DATA_DIRS[((i++))]="var/db"
DATA_DIRS[((i++))]="api"
export DATA_DIRS
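The `((i++))` subscript is an arithmetic context, so each assignment appends at the current index and post-increments it, keeping entries in declaration order. A throwaway sketch of the idiom (the `DEMO_DIRS` name is illustrative; bash is required, as arrays are not POSIX sh):

```shell
# Sketch of the array-building idiom used in data_dirs.env: each assignment
# lands at index i and post-increments it, so entries stay ordered.
i=0
DEMO_DIRS[((i++))]="etc"
DEMO_DIRS[((i++))]="var/db"
echo "${#DEMO_DIRS[@]}"   # 2
echo "${DEMO_DIRS[1]}"    # var/db
```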
@@ -1,15 +0,0 @@
#!/bin/bash
# Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)

# Trap to kill the container if necessary.
trap "exit" SIGINT SIGTERM

# Run every .sh script located in the entrypoint-scripts folder, in lexicographical order
for script in `ls /entrypoint-scripts/*.sh | sort -n`; do
  bash "$script"
done

##############################################################################
# Start Wazuh Server.
##############################################################################

/sbin/my_init
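The runner above relies on numbered filename prefixes to order execution. A throwaway demo of the same loop (the temp directory and script names are illustrative):

```shell
# Throwaway demo of the lexicographic runner: numbered prefixes control order.
demo=$(mktemp -d)
printf 'echo first\n'  > "$demo/00-first.sh"
printf 'echo second\n' > "$demo/10-second.sh"
out=$(for script in `ls "$demo"/*.sh | sort -n`; do bash "$script"; done)
echo "$out"   # first
              # second
rm -rf "$demo"
```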
@@ -1,4 +0,0 @@
#!/bin/sh
# Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)
/etc/init.d/filebeat start
tail -f /var/log/filebeat/filebeat
wazuh/config/filebeat.yml (new file)
@@ -0,0 +1,16 @@
filebeat:
  prospectors:
    - input_type: log
      paths:
        - "/var/ossec/data/logs/alerts/alerts.json"
      document_type: wazuh-alerts
      json.message_key: log
      json.keys_under_root: true
      json.overwrite_keys: true

output:
  logstash:
    # The Logstash hosts
    hosts: ["logstash:5000"]
    # ssl:
      # certificate_authorities: ["/etc/filebeat/logstash.crt"]
@@ -1,55 +0,0 @@
|
|||||||
# Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)
|
|
||||||
|
|
||||||
# Wazuh - Filebeat configuration file
|
|
||||||
filebeat.inputs:
|
|
||||||
- type: log
|
|
||||||
paths:
|
|
||||||
- '/var/ossec/logs/alerts/alerts.json'
|
|
||||||
|
|
||||||
setup.template.json.enabled: true
|
|
||||||
setup.template.json.path: "/etc/filebeat/wazuh-template.json"
|
|
||||||
setup.template.json.name: "wazuh"
|
|
||||||
setup.template.overwrite: true
|
|
||||||
|
|
||||||
processors:
|
|
||||||
- decode_json_fields:
|
|
||||||
fields: ['message']
|
|
||||||
process_array: true
|
|
||||||
max_depth: 200
|
|
||||||
target: ''
|
|
||||||
overwrite_keys: true
|
|
||||||
- drop_fields:
|
|
||||||
fields: ['message', 'ecs', 'beat', 'input_type', 'tags', 'count', '@version', 'log', 'offset', 'type', 'host']
|
|
||||||
- rename:
|
|
||||||
fields:
|
|
||||||
- from: "data.aws.sourceIPAddress"
|
|
||||||
to: "@src_ip"
|
|
||||||
ignore_missing: true
|
|
||||||
fail_on_error: false
|
|
||||||
when:
|
|
||||||
regexp:
|
|
||||||
data.aws.sourceIPAddress: \b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b
|
|
||||||
- rename:
|
|
||||||
fields:
|
|
||||||
- from: "data.srcip"
|
|
||||||
to: "@src_ip"
|
|
||||||
ignore_missing: true
|
|
||||||
fail_on_error: false
|
|
||||||
when:
|
|
||||||
regexp:
|
|
||||||
data.srcip: \b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b
|
|
||||||
- rename:
|
|
||||||
fields:
|
|
||||||
- from: "data.win.eventdata.ipAddress"
|
|
||||||
to: "@src_ip"
|
|
||||||
ignore_missing: true
|
|
||||||
fail_on_error: false
|
|
||||||
when:
|
|
||||||
regexp:
|
|
||||||
data.win.eventdata.ipAddress: \b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b
|
|
||||||
|
|
||||||
output.elasticsearch:
|
|
||||||
hosts: ['http://elasticsearch:9200']
|
|
||||||
#pipeline: geoip
|
|
||||||
indices:
|
|
||||||
- index: 'wazuh-alerts-4.x-%{+yyyy.MM.dd}'
|
|
||||||
@@ -1,20 +0,0 @@
|
|||||||
# Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)
|
|
||||||
|
|
||||||
# Wazuh - Filebeat configuration file
|
|
||||||
filebeat:
|
|
||||||
inputs:
|
|
||||||
- type: log
|
|
||||||
paths:
|
|
||||||
- "/var/ossec/logs/alerts/alerts.json"
|
|
||||||
# - type: log
|
|
||||||
# paths:
|
|
||||||
# - "/var/ossec/logs/archives/archives.json"
|
|
||||||
# fields:
|
|
||||||
# wazuh_log_file: "archives"
|
|
||||||
|
|
||||||
output:
|
|
||||||
logstash:
|
|
||||||
# The Logstash hosts
|
|
||||||
hosts: ["logstash:5000"]
|
|
||||||
# ssl:
|
|
||||||
# certificate_authorities: ["/etc/filebeat/logstash.crt"]
|
|
||||||
wazuh/config/init.bash (new file)
@@ -0,0 +1,12 @@
#!/bin/bash

#
# Initialize the custom data directory layout
#
source /data_dirs.env

cd /var/ossec
for ossecdir in "${DATA_DIRS[@]}"; do
  mv ${ossecdir} ${ossecdir}-template
  ln -s $(realpath --relative-to=$(dirname ${ossecdir}) data)/${ossecdir} ${ossecdir}
done
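The `ln -s` line builds a symlink target relative to the entry's own parent directory, which matters for nested entries such as `var/db`. A demo of that step in a throwaway tree instead of `/var/ossec` (requires GNU `realpath` with `--relative-to`):

```shell
# Demo of the relative-symlink step in init.bash for a nested entry.
root=$(mktemp -d)
cd "$root"
mkdir -p data/var/db var
ossecdir="var/db"
ln -s $(realpath --relative-to=$(dirname ${ossecdir}) data)/${ossecdir} ${ossecdir}
readlink var/db    # ../data/var/db
```

The link target is relative (`../data/var/db`), so the layout survives being mounted at a different absolute path.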
@@ -1,83 +0,0 @@
|
|||||||
# Permanent data mounted in volumes
|
|
||||||
i=0
|
|
||||||
PERMANENT_DATA[((i++))]="/var/ossec/api/configuration"
|
|
||||||
PERMANENT_DATA[((i++))]="/var/ossec/etc"
|
|
||||||
PERMANENT_DATA[((i++))]="/var/ossec/logs"
|
|
||||||
PERMANENT_DATA[((i++))]="/var/ossec/queue"
|
|
||||||
PERMANENT_DATA[((i++))]="/var/ossec/agentless"
|
|
||||||
PERMANENT_DATA[((i++))]="/var/ossec/var/multigroups"
|
|
||||||
PERMANENT_DATA[((i++))]="/var/ossec/integrations"
|
|
||||||
PERMANENT_DATA[((i++))]="/var/ossec/active-response/bin"
|
|
||||||
PERMANENT_DATA[((i++))]="/var/ossec/wodles"
|
|
||||||
PERMANENT_DATA[((i++))]="/etc/filebeat"
|
|
||||||
PERMANENT_DATA[((i++))]="/etc/postfix"
|
|
||||||
PERMANENT_DATA[((i++))]="/var/ossec/var/db"
|
|
||||||
export PERMANENT_DATA
|
|
||||||
|
|
||||||
# Files mounted in a volume that should not be permanent
|
|
||||||
i=0
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/etc/internal_options.conf"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/integrations/pagerduty"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/integrations/slack"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/integrations/slack.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/integrations/virustotal"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/integrations/virustotal.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/integrations/shuffle.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/integrations/shuffle"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/default-firewall-drop"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/disable-account"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/firewalld-drop"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/firewall-drop"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/host-deny"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/ip-customblock"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/ipfw"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/kaspersky.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/kaspersky"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/npf"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/wazuh-slack"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/pf"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/restart-wazuh"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/restart.sh"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/active-response/bin/route-null"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/sshlogin.exp"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/ssh_pixconfig_diff"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/ssh_asa-fwsmconfig_diff"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/ssh_integrity_check_bsd"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/main.exp"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/su.exp"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/ssh_integrity_check_linux"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/register_host.sh"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/ssh_generic_diff"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/ssh_foundry_diff"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/ssh_nopass.exp"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/agentless/ssh.exp"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/aws/aws-s3"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/aws/aws-s3.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/azure/azure-logs"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/azure/azure-logs.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/azure/orm.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/docker/DockerListener"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/docker/DockerListener.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/gcloud/gcloud"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/gcloud/gcloud.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/gcloud/buckets/access_logs.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/gcloud/buckets/bucket.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/gcloud/pubsub/subscriber.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/gcloud/integration.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/gcloud/tools.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/gcloud/exceptions.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/wodles/utils.py"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/queue/vulnerabilities/dictionaries/cpe_helper.json"
|
|
||||||
PERMANENT_DATA_EXCP[((i++))]="/var/ossec/var/db/mitre.db"
|
|
||||||
export PERMANENT_DATA_EXCP
|
|
||||||
|
|
||||||
# Files mounted in a volume that should be deleted when updating
|
|
||||||
i=0
|
|
||||||
PERMANENT_DATA_DEL[((i++))]="/var/ossec/queue/db/.template.db"
|
|
||||||
PERMANENT_DATA_DEL[((i++))]="/var/ossec/var/db/.profile.db*"
|
|
||||||
PERMANENT_DATA_DEL[((i++))]="/var/ossec/var/db/.template.db*"
|
|
||||||
PERMANENT_DATA_DEL[((i++))]="/var/ossec/var/db/agents/*"
|
|
||||||
PERMANENT_DATA_DEL[((i++))]="/var/ossec/wodles/cve.db"
|
|
||||||
PERMANENT_DATA_DEL[((i++))]="/var/ossec/queue/vulnerabilities/cve.db"
|
|
||||||
PERMANENT_DATA_DEL[((i++))]="/var/ossec/queue/fim/db/fim.db"
|
|
||||||
export PERMANENT_DATA_DEL
|
|
||||||
@@ -1,40 +0,0 @@
#!/bin/bash
# Wazuh App Copyright (C) 2019 Wazuh Inc. (License GPLv2)

# Variables
source /permanent_data.env

WAZUH_INSTALL_PATH=/var/ossec
DATA_TMP_PATH=${WAZUH_INSTALL_PATH}/data_tmp
mkdir ${DATA_TMP_PATH}

# Move exclusion files to EXCLUSION_PATH
EXCLUSION_PATH=${DATA_TMP_PATH}/exclusion
mkdir ${EXCLUSION_PATH}

for exclusion_file in "${PERMANENT_DATA_EXCP[@]}"; do
  # Create the directory for the exclusion file if it does not exist
  DIR=$(dirname "${exclusion_file}")
  if [ ! -e ${EXCLUSION_PATH}/${DIR} ]
  then
    mkdir -p ${EXCLUSION_PATH}/${DIR}
  fi

  mv ${exclusion_file} ${EXCLUSION_PATH}/${exclusion_file}
done

# Move permanent files to PERMANENT_PATH
PERMANENT_PATH=${DATA_TMP_PATH}/permanent
mkdir ${PERMANENT_PATH}

for permanent_dir in "${PERMANENT_DATA[@]}"; do
  # Create the directory for the permanent file if it does not exist
  DIR=$(dirname "${permanent_dir}")
  if [ ! -e ${PERMANENT_PATH}${DIR} ]
  then
    mkdir -p ${PERMANENT_PATH}${DIR}
  fi

  mv ${permanent_dir} ${PERMANENT_PATH}${permanent_dir}

done
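Both loops above use the same "recreate the parent directories under the destination root, then move" pattern so that absolute source paths keep their layout under the staging tree. A minimal sketch of one iteration, with throwaway temp trees standing in for `/var/ossec` and the exclusion path:

```shell
# Sketch of the "recreate parent dirs, then move" pattern; SRC/DST are
# throwaway stand-ins for the real filesystem and the exclusion path.
SRC=$(mktemp -d); DST=$(mktemp -d)
exclusion_file="/etc/internal_options.conf"
mkdir -p "${SRC}$(dirname ${exclusion_file})"
touch "${SRC}${exclusion_file}"
DIR=$(dirname "${exclusion_file}")
mkdir -p "${DST}${DIR}"
mv "${SRC}${exclusion_file}" "${DST}${exclusion_file}"
ls "${DST}${DIR}"    # internal_options.conf
```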
@@ -1,4 +0,0 @@
#!/bin/sh
# Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)
/usr/sbin/postfix start
tail -f /var/log/mail.log
wazuh/config/run.sh (new file)
@@ -0,0 +1,79 @@
#!/bin/bash

#
# OSSEC container bootstrap. See the README for information on the environment
# variables expected by this script.
#

#
# Startup the services
#

source /data_dirs.env
FIRST_TIME_INSTALLATION=false
DATA_PATH=/var/ossec/data

for ossecdir in "${DATA_DIRS[@]}"; do
  if [ ! -e "${DATA_PATH}/${ossecdir}" ]
  then
    echo "Installing ${ossecdir}"
    mkdir -p $(dirname ${DATA_PATH}/${ossecdir})
    cp -pr /var/ossec/${ossecdir}-template ${DATA_PATH}/${ossecdir}
    FIRST_TIME_INSTALLATION=true
  fi
done

touch ${DATA_PATH}/process_list
chgrp ossec ${DATA_PATH}/process_list
chmod g+rw ${DATA_PATH}/process_list

AUTO_ENROLLMENT_ENABLED=${AUTO_ENROLLMENT_ENABLED:-true}

if [ $FIRST_TIME_INSTALLATION == true ]
then
  if [ $AUTO_ENROLLMENT_ENABLED == true ]
  then
    if [ ! -e ${DATA_PATH}/etc/sslmanager.key ]
    then
      echo "Creating ossec-authd key and cert"
      openssl genrsa -out ${DATA_PATH}/etc/sslmanager.key 4096
      openssl req -new -x509 -key ${DATA_PATH}/etc/sslmanager.key\
          -out ${DATA_PATH}/etc/sslmanager.cert -days 3650\
          -subj /CN=${HOSTNAME}/
    fi
  fi
fi

function ossec_shutdown(){
  /var/ossec/bin/ossec-control stop;
  if [ $AUTO_ENROLLMENT_ENABLED == true ]
  then
    kill $AUTHD_PID
  fi
}

# Trap exit signals and do a proper shutdown
trap "ossec_shutdown; exit" SIGINT SIGTERM

chmod -R g+rw ${DATA_PATH}

if [ $AUTO_ENROLLMENT_ENABLED == true ]
then
  echo "Starting ossec-authd..."
  /var/ossec/bin/ossec-authd -p 1515 -g ossec $AUTHD_OPTIONS >/dev/null 2>&1 &
  AUTHD_PID=$!
fi
sleep 15 # give ossec a reasonable amount of time to start before checking status
LAST_OK_DATE=`date +%s`

## Start services
/usr/sbin/postfix start
/bin/node /var/ossec/api/app.js &
/usr/bin/filebeat.sh &
/var/ossec/bin/ossec-control restart

tail -f /var/ossec/logs/ossec.log
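The authd certificate step in run.sh is a standard self-signed key/cert pair generated with `openssl`. A hedged sketch of the same two commands against a temp directory (a smaller 2048-bit key and a fixed `demo-host` CN are used here just to keep the demo quick; run.sh uses 4096 bits and `${HOSTNAME}`):

```shell
# Sketch of the authd key/cert generation from run.sh, in a temp dir.
dir=$(mktemp -d)
openssl genrsa -out "$dir/sslmanager.key" 2048 2>/dev/null
openssl req -new -x509 -key "$dir/sslmanager.key" \
    -out "$dir/sslmanager.cert" -days 3650 -subj /CN=demo-host/
openssl x509 -in "$dir/sslmanager.cert" -noout -subject
```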
wazuh/config/wazuh.repo (new file)
@@ -0,0 +1,7 @@
[wazuh_repo]
gpgcheck=1
gpgkey=https://packages.wazuh.com/key/GPG-KEY-WAZUH
enabled=1
name=CENTOS-$releasever - Wazuh
baseurl=https://packages.wazuh.com/yum/el/$releasever/$basearch
protect=1
@@ -1,4 +0,0 @@
#!/bin/sh
# Wazuh Docker Copyright (C) 2019 Wazuh Inc. (License GPLv2)
/etc/init.d/wazuh-manager start
tail -f /var/ossec/logs/ossec.log