feat(S3_in_w_x_flags): Support S3 URIs for custom checks paths and whitelist files. (#1090)

* feat(S3_in_w_x_flags): Support S3 URIs for custom checks paths and whitelist files.

* feat(S3_in_w_x_flags): README document was updated.

* Update README.md

* Update README.md

* Update README.md

* Update README.md

Co-authored-by: Toni de la Fuente <toni@blyx.com>
Co-authored-by: Sergio Garcia Garcia
Sergio Garcia
2022-04-07 14:37:02 -04:00
committed by GitHub
parent 07b2b0de5a
commit 28fff104a1
4 changed files with 118 additions and 18 deletions

README.md

@@ -292,11 +292,12 @@ Prowler has two parameters related to regions: `-r` that is used query AWS servi
>Note about output formats to use with `-M`: "text" is the default one with colors, "mono" is like default one but monochrome, "csv" is comma separated values, "json" plain basic json (without comma between lines) and "json-asff" is also json with Amazon Security Finding Format that you can ship to Security Hub using `-S`.
or save your report in an S3 bucket (this only works for text or mono. For csv, json or json-asff it has to be copied afterwards):
To save your report in an S3 bucket, use `-B` to define a custom output bucket along with `-M` to define the output format that is going to be uploaded to S3:
```sh
./prowler -M mono | aws s3 cp - s3://bucket-name/prowler-report.txt
./prowler -M csv -B my-bucket/folder/
```
>If you prefer to use the initial credentials instead of the assumed role credentials to put the reports into the S3 bucket, use `-D` instead of `-B`. Make sure the credentials used have `s3:PutObject` permissions on the S3 path where the reports are going to be uploaded.
When generating multiple formats and running with Docker, bind a local directory to the container to retrieve the reports, e.g.:
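A minimal sketch of such a run follows (the image name `toniblyx/prowler`, the in-container `/prowler/output` path, and argument pass-through via the image entrypoint are assumptions, adjust them to your setup):
```sh
# bind-mount a host folder over the container's output directory so the
# generated reports stay on the host after the container exits
mkdir -p output
docker run -ti --rm --name prowler \
  --env AWS_ACCESS_KEY_ID --env AWS_SECRET_ACCESS_KEY --env AWS_SESSION_TOKEN \
  --volume "$(pwd)/output:/prowler/output" \
  toniblyx/prowler -M csv,json
```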
@@ -399,7 +400,10 @@ Prowler runs in GovCloud regions as well. To make sure it points to the right AP
### Custom folder for custom checks
Flag `-x /my/own/checks` will include any check in that particular directory. To see how to write checks see [Add Custom Checks](#add-custom-checks) section.
Flag `-x /my/own/checks` will include any check in that particular directory (file names must start with `check`). To see how to write checks, see the [Add Custom Checks](#add-custom-checks) section.
S3 URIs are also supported as custom folders for custom checks, e.g. `s3://bucket/prefix/checks`. Prowler will download the folder locally and run the checks when they are called by the default execution, `-c` or `-g`.
>Make sure the credentials used have `s3:GetObject` permissions on the S3 path where the custom checks are located.
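For example, a run that pulls custom checks from S3 and executes one of them could look like this sketch (bucket, prefix and check ID are placeholders):
```sh
./prowler -x s3://my-bucket/prowler/custom-checks -c extra9999
```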
### Show or log only FAILs
@@ -488,6 +492,9 @@ Sometimes you may find resources that are intentionally configured in a certain
./prowler -w whitelist_sample.txt
```
S3 URIs are also supported as the allowlist file, e.g. `s3://bucket/prefix/allowlist_sample.txt`.
>Make sure the credentials used have `s3:GetObject` permissions on the S3 path where the allowlist file is located.
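For example (bucket and prefix are placeholders):
```sh
./prowler -w s3://my-bucket/prowler/allowlist_sample.txt
```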
The whitelist option works along with other options and reports a `WARNING` instead of `INFO`, `PASS` or `FAIL` in any output format except `json-asff`.
## How to fix every FAIL

include/allowlist Normal file (43 lines)

@@ -0,0 +1,43 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2018) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
allowlist(){
# check if the file is an S3 URI
if grep -q -E "^s3://([^/]+)/(.*?([^/]+))$" <<< "$ALLOWLIST_FILE"; then
# download s3 object
local S3_ALLOWLIST_FILE=allowlist_s3_file.txt
echo -e "$NOTICE Downloading allowlist from S3 URI $ALLOWLIST_FILE ..."
if ! $AWSCLI s3 cp $ALLOWLIST_FILE $S3_ALLOWLIST_FILE $PROFILE_OPT > /dev/null 2>&1; then
echo "$BAD FAIL! Access Denied trying to download allowlist from the S3 URI, please make sure it is correct and/or you have permissions to get the S3 object.$NORMAL"
EXITCODE=1
exit $EXITCODE
fi
echo -e "$OK Success! Allowlist was downloaded, starting Prowler...$NORMAL"
# ignore lines starting with # (comments)
# ignore inline comments: check1:foo # inline comment
ALLOWLIST=$(awk '!/^[[:space:]]*#/{print }' <(cat "$S3_ALLOWLIST_FILE") | sed 's/[[:space:]]*#.*$//g')
# remove temporary file
rm -f "$S3_ALLOWLIST_FILE"
else
# Check if input allowlist file exists
if [[ -f "$ALLOWLIST_FILE" ]]; then
# ignore lines starting with # (comments)
# ignore inline comments: check1:foo # inline comment
ALLOWLIST=$(awk '!/^[[:space:]]*#/{print }' <(cat "$ALLOWLIST_FILE") | sed 's/[[:space:]]*#.*$//g')
else
echo "$BAD FAIL! $ALLOWLIST_FILE does not exist, please input a valid allowlist file.$NORMAL"
EXITCODE=1
exit $EXITCODE
fi
fi
}
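As a reference for the comment handling in `allowlist()` above, the file it parses could look like this sketch (check IDs and resource names are hypothetical; the `checkID:resource` format follows `whitelist_sample.txt`):
```sh
# full-line comments are removed by the awk filter
check11:my-break-glass-user      # inline comments are stripped by the sed step
extra718:my-intentionally-public-bucket
```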

include/custom_checks Normal file (52 lines)

@@ -0,0 +1,52 @@
#!/usr/bin/env bash
# Prowler - the handy cloud security tool (copyright 2018) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.
custom_checks(){
# check if the path is an S3 URI
if grep -q -E "^s3://([^/]+)/?(.*?([^/]+)/?)?$" <<< "$EXTERNAL_CHECKS_PATH"; then
if grep -q "check*" <<< "$("${AWSCLI}" s3 ls "${EXTERNAL_CHECKS_PATH}" $PROFILE_OPT)"; then
# download s3 object
echo -e "$NOTICE Downloading custom checks from S3 URI $EXTERNAL_CHECKS_PATH...$NORMAL"
S3_CHECKS_TEMP_FOLDER="$PROWLER_DIR/s3-custom-checks"
mkdir "${S3_CHECKS_TEMP_FOLDER}"
$AWSCLI s3 sync "$EXTERNAL_CHECKS_PATH" "${S3_CHECKS_TEMP_FOLDER}" $PROFILE_OPT > /dev/null
# source and include each downloaded check
for checks in "${S3_CHECKS_TEMP_FOLDER}"/check*; do
. "$checks"
echo -e "$OK Check $(basename "$checks") was included!$NORMAL"
done
echo -e "$OK Success! Custom checks were downloaded and included, starting Prowler...$NORMAL"
# remove temporary dir
rm -rf "${S3_CHECKS_TEMP_FOLDER}"
else
echo "$BAD FAIL! Access Denied trying to download custom checks or $EXTERNAL_CHECKS_PATH does not contain any checks, please make sure it is correct and/or you have permissions to get the S3 objects.$NORMAL"
EXITCODE=1
# remove temporary dir
rm -rf "${S3_CHECKS_TEMP_FOLDER}"
exit $EXITCODE
fi
else
# verify if input directory exists with checks
if ls "${EXTERNAL_CHECKS_PATH}"/check* > /dev/null 2>&1; then
for checks in "${EXTERNAL_CHECKS_PATH}"/check*; do
. "$checks"
echo -e "$OK Check $(basename "$checks") was included!$NORMAL"
done
echo -e "$OK Success! Custom checks were included, starting Prowler...$NORMAL"
else
echo "$BAD FAIL! $EXTERNAL_CHECKS_PATH does not exist or not contain checks, please input a valid custom checks path.$NORMAL"
EXITCODE=1
exit $EXITCODE
fi
fi
}
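Each file sourced by `custom_checks()` is expected to look like a built-in check. The following is a minimal sketch only; the metadata variable names and the `textPass` helper follow the pattern of Prowler v2 built-in checks and should be treated as assumptions (copy a real file from the `checks/` directory as your starting point):
```sh
#!/usr/bin/env bash
# file name must start with "check" so the glob in custom_checks() picks it up,
# e.g. check_extra9999
CHECK_ID_extra9999="9999"
CHECK_TITLE_extra9999="[extra9999] Example custom check"
CHECK_SCORED_extra9999="NOT_SCORED"
CHECK_CIS_LEVEL_extra9999="EXTRA"
CHECK_SEVERITY_extra9999="Low"
CHECK_SERVICENAME_extra9999="s3"
CHECK_ALTERNATE_check9999="extra9999"

extra9999(){
  # report a passing finding; use textFail for a failing one
  textPass "Example resource is configured as expected"
}
```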

prowler (28 lines changed)

@@ -43,7 +43,7 @@ FAILED_CHECK_FAILED_SCAN=1
PROWLER_START_TIME=$( date -u +"%Y-%m-%dT%H:%M:%S%z" )
TITLE_ID=""
TITLE_TEXT="CALLER ERROR - UNSET TITLE"
WHITELIST_FILE=""
ALLOWLIST_FILE=""
TOTAL_CHECKS=()
# Ensures command output will always be set to JSON.
@@ -88,8 +88,8 @@ USAGE:
-s Show scoring report (it is included by default in the html report).
-S Send check output to AWS Security Hub. Only valid when the output mode is json-asff
(i.e. "-M json-asff -S").
-x Specify external directory with custom checks
(i.e. /my/own/checks, files must start by "check").
-x Specify external directory with custom checks. S3 URI is supported.
(i.e. /my/own/checks or s3://bucket/prefix/checks, files must start with "check").
-q Get only FAIL findings, will show WARNINGS when a resource is excluded.
-A Account id for the account where to assume a role, requires -R.
(i.e.: 123456789012)
@@ -98,8 +98,8 @@ USAGE:
-T Session duration given to that role credentials in seconds, default 1h (3600) recommended 12h, optional with -R and -A.
(i.e.: 43200)
-I External ID to be used when assuming roles (not mandatory), requires -A and -R.
-w Whitelist file. See whitelist_sample.txt for reference and format.
(i.e.: whitelist_sample.txt)
-w Allowlist file. See allowlist_sample.txt for reference and format. S3 URI is supported.
(i.e.: allowlist_sample.txt or s3://bucket/prefix/allowlist_sample.txt)
-N <shodan_api_key> Shodan API key used by check extra7102.
-o Custom output directory, if not specified will use default prowler/output, requires -M <mode>.
(i.e.: -M csv -o /tmp/reports/)
@@ -201,7 +201,7 @@ while getopts ":hlLkqp:r:c:C:g:f:m:M:E:x:enbVsSI:A:R:T:w:N:o:B:D:F:zZ:O:" OPTION
SESSION_DURATION_TO_ASSUME=$OPTARG
;;
w )
WHITELIST_FILE=$OPTARG
ALLOWLIST_FILE=$OPTARG
;;
N )
SHODAN_API_KEY=$OPTARG
@@ -294,6 +294,8 @@ unset AWS_DEFAULT_OUTPUT
. $PROWLER_DIR/include/securityhub_integration
. $PROWLER_DIR/include/junit_integration
. $PROWLER_DIR/include/organizations_metadata
. $PROWLER_DIR/include/custom_checks
. $PROWLER_DIR/include/allowlist
# Parses the check file into CHECK_ID's.
if [[ -n "$CHECK_FILE" ]]; then
@@ -308,11 +310,9 @@ if [[ -n "$CHECK_FILE" ]]; then
fi
fi
# Pre-process whitelist file if supplied
if [[ -n "$WHITELIST_FILE" ]]; then
# ignore lines starting with # (comments)
# ignore inline comments: check1:foo # inline comment
WHITELIST="$(awk '!/^[[:space:]]*#/{print }' <(cat "$WHITELIST_FILE") | sed 's/[[:space:]]*#.*$//g')"
# Pre-process allowlist file if supplied
if [[ -n "$ALLOWLIST_FILE" ]]; then
allowlist
fi
# Load all of the groups of checks inside groups folder named as "groupNumber*"
@@ -328,9 +328,7 @@ done
# include checks if external folder is specified
if [[ $EXTERNAL_CHECKS_PATH ]]; then
for checks in $(ls $EXTERNAL_CHECKS_PATH/check*); do
. "$checks"
done
custom_checks
fi
# Get a list of total checks available by ID
@@ -462,7 +460,7 @@ execute_check() {
# Generate the credential report, only if it is group1 related which checks we
# run so that the checks can safely assume it's available
# set the custom ignores list for this check
ignores="$(awk "/${1}/{print}" <(echo "${WHITELIST}"))"
ignores="$(awk "/${1}/{print}" <(echo "${ALLOWLIST}"))"
if [ ${alternate_name} ];then
if [[ ${alternate_name} == check1* || ${alternate_name} == extra71 || ${alternate_name} == extra774 || ${alternate_name} == extra7123 ]];then