Prowler 2.7.0 - Brave (#998)

* Extra7161 EFS encryption at rest check

* Added check_extra7162, which checks if log groups have a 365-day retention period

* fixed the code to handle all regions and formatted the output

* changed the check title, resource type and service name, and made the code more dynamic

* Extra7161 EFS encryption at rest check

* New check_extra7163 Secrets Manager key rotation enabled

* New check7160: AutomaticVersionUpgrade enabled on Redshift clusters

* Update ProwlerRole.yaml to have same permissions as util/org-multi-account/ProwlerRole.yaml

* Fix link to quicksight dashboard

* Install detect-secrets (e.g. for check_extra742)
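
For reference, detect-secrets is a standalone Python tool; a typical install and baseline scan look like this (the baseline file name is illustrative):

    pip install detect-secrets
    detect-secrets scan > .secrets.baseline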

* Updating check_extra7163 with requested changes

* fix(assumed-role): Check if -T and -A options are set

* docs(Readme): `-T` option is not mandatory
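
For context, a typical assume-role invocation looks roughly like this; the account ID, role name and session duration are placeholders, and `-T` can simply be omitted to use the default session duration:

    ./prowler -A 123456789012 -R ProwlerRole            # default session duration
    ./prowler -A 123456789012 -R ProwlerRole -T 7200    # explicit duration in seconds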

* fix(assume-role): Handle AWS STS CLI errors

* fix(assume-role): Handle AWS STS CLI errors

* Update group25_FTR

When trying to run group 25 (the Amazon FTR related security checks), nothing happens. Looking at the code, there is a misconfiguration in two parameters, GROUP_RUN_BY_DEFAULT[9] and GROUP_CHECKS[9]: changing the 9 to 25 in both fixed the issue, as sketched below.
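
A minimal sketch of the corrected group file, assuming the group arrays are indexed by the group number; the title and check list here are illustrative, not the real contents of group25_FTR:

    # groups/group25_FTR (sketch)
    GROUP_ID[25]='ftr'
    GROUP_NUMBER[25]='25'
    GROUP_TITLE[25]='FTR - Amazon FTR related security checks'
    GROUP_RUN_BY_DEFAULT[25]='N'            # was indexed [9]
    GROUP_CHECKS[25]='check11,check12'      # was indexed [9]; check list illustrative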

* Update README.md

Fixed a broken link caused by the capital letters in the group file name (group25_FTR)

* Fix for issue #938: assume_role called multiple times

* Label 2.7.0-1December2021 for tests

* Fixed an error that appeared when the number of findings was very high.

* Adjusted the batch to only import 50 findings at a time, since 100 caused capacity issues. Also added a check for an edge case: if the number of updated findings was an exact multiple of the batch size, the code would attempt to import 0 findings and throw an error. See the sketch below.
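
A minimal sketch of that batching logic, reusing the variable names from the diff at the bottom of this commit (the Security Hub call is shown as a plain aws invocation for brevity):

    FINDINGS_COUNT=$(echo "$SECURITY_HUB_PREVIOUS_FINDINGS" | jq '. | length')
    for i in $(seq 0 50 "$FINDINGS_COUNT"); do
      BATCH_FINDINGS=$(echo "$SECURITY_HUB_PREVIOUS_FINDINGS" | jq -c '.['"$i:$i+50"']')
      BATCH_FINDINGS_COUNT=$(echo "$BATCH_FINDINGS" | jq '. | length')
      # the last slice is empty when the findings count is an exact multiple of 50, so skip it
      if [ "$BATCH_FINDINGS_COUNT" -gt 0 ]; then
        aws securityhub batch-import-findings --findings "$BATCH_FINDINGS"
      fi
    done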

* Added line to delete the temp folder after everything is done.

* New check 7164: Check if CloudWatch log groups are protected by AWS KMS (@maisenhe)

* updated CHECK_RISK

* Added checks extra7160,extra7161,extra7162,extra7163 to group Extras

* Added checks extra7160,extra7161,extra7162,extra7163 to group Extras

* Added issue templates

* New check 7165 DynamoDB: DAX encrypted at rest @Daniel-Peladeau

* New check 7165 DynamoDB: DAX encrypted at rest @Daniel-Peladeau

* Fix #963 check 792 to force json in ELB queries

* Fix #957 check 763 had us-east-1 region hardcoded

* Fix #962 check 7147 ALTERNATE NAME

* Fix #940: handle the error when functions cannot be listed

* Added new checks 7164 and 7165 to group extras

* Added the invalid check or group ID to the error message (#962)

* Fix Broken Link

* Add docker volume example to README.md
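
For reference, an invocation along those lines; the image tag, mount paths and output flags are illustrative:

    docker run -ti --rm --name prowler \
      --volume "$HOME/.aws:/root/.aws" \
      --volume "$(pwd)/output:/prowler/output" \
      --env AWS_PROFILE=default \
      toniblyx/prowler:latest -M csv,json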

* Updated Dockerfile to use amazonlinux container

* Updated Dockerfile with AWS cli v2

* Added an upgrade step to the RUN instruction

* Added cache purge to Dockerfile

* Backup AWS Credentials before AssumeRole and Restore them before CopyToS3

* exporting the ENV variables

* fixed bracket

* Improved documentation for install process

* fix checks with comma issues

* Added -D option to copy to S3 with the initial AWS credentials

* Cosmetic variable name change

* Added $PROFILE_OPT to CopyToS3 commands

* remove commas

* removed file as it is not needed

* Improved the help usage output (-h option)

* Fixed CIS LEVEL on 7163 through 7165

* When performing restoreInitialAWSCredentials, unset the credential ENV variables if they were never set
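
A minimal sketch of that behaviour; only the name restoreInitialAWSCredentials comes from this changelog, the body and the backup counterpart are illustrative:

    backupInitialAWSCredentials() {
      INITIAL_AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID:-}"
      INITIAL_AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY:-}"
      INITIAL_AWS_SESSION_TOKEN="${AWS_SESSION_TOKEN:-}"
    }

    restoreInitialAWSCredentials() {
      if [[ -n "$INITIAL_AWS_ACCESS_KEY_ID" ]]; then
        export AWS_ACCESS_KEY_ID="$INITIAL_AWS_ACCESS_KEY_ID"
        export AWS_SECRET_ACCESS_KEY="$INITIAL_AWS_SECRET_ACCESS_KEY"
        export AWS_SESSION_TOKEN="$INITIAL_AWS_SESSION_TOKEN"
      else
        # the original environment never had static credentials, so drop the assume-role ones
        unset AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN
      fi
    }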

* New check 7166 Elastic IP addresses with associations are protected by AWS Shield Advanced

* New check 7167 CloudFront distributions are protected by AWS Shield Advanced

* New check 7168 Route53 hosted zones are protected by AWS Shield Advanced

* New check 7169 Global accelerators are protected by AWS Shield Advanced

* New check 7170 Application load balancers are protected by AWS Shield Advanced

* New check 7171 Classic load balancers are protected by AWS Shield Advanced

* Include example for global resources

* Added corrections to the AWS Shield Advanced protection checks

* Added Shield actions GetSubscriptionState and DescribeProtection

* Added Shield actions GetSubscriptionState and DescribeProtection

* docs(templates): Improve bug template with more info (#982)

* Removed echoes after role chaining fix

* Changed Route53 checks 7152 and 7153 to INFO when no domains are found

* Changed Route53 checks 7152 and 7153 titles for clarity

* Added passed security groups to the output of check 778

* Added passed security groups to the output of check 777 and updated its title

* Added FAIL as error handling when SCP prevents queries to regions

* Label version 2.7.0-6January2022

* Updated .dockerignore with .github/

* Fix: issues #758 and #984

* Fix: issue #741 CloudFront and real-time logs

* Fix issue #971: set all to INFO instead of FAIL when there is no access to the resource

* Fix: issue #986

* Add additional action permissions for Glue and Shield Advanced checks @lazize

* Add extra shield action permission

Allows the shield:GetSubscriptionState action

* Add permission actions

Ensure that all files where permission actions are needed list the same actions

* Fix: Credential chaining from environment variables @lazize #996f

If a profile is not defined, restore the original credentials from the environment variables,
if they exist, before assume-role.
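
Roughly, the fix amounts to something like this; the variable names other than restoreInitialAWSCredentials are illustrative:

    if [[ -z "$PROFILE" && -n "$INITIAL_AWS_ACCESS_KEY_ID" ]]; then
      # no profile given: put the original environment credentials back before assuming the role
      restoreInitialAWSCredentials
    fi
    # ... assume-role happens after this point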

* Label version 2.7.0-24January2022

Co-authored-by: Lee Myers <ichilegend@gmail.com>
Co-authored-by: Chinedu Obiakara <obiakac@amazon.com>
Co-authored-by: Daniel Peladeau <dcpeladeau@gmail.com>
Co-authored-by: Jonathan Lozano <jonloza@amazon.com>
Co-authored-by: Daniel Lorch <dlorch@gmail.com>
Co-authored-by: Pepe Fagoaga <jose.fagoaga@smartprotection.com>
Co-authored-by: Israel <6672089+lopmoris@users.noreply.github.com>
Co-authored-by: root <halfluke@gmail.com>
Co-authored-by: nikirby <nikirby@amazon.com>
Co-authored-by: Joel Maisenhelder <maisenhe@gmail.com>
Co-authored-by: RT <35173068+rtcms@users.noreply.github.com>
Co-authored-by: Andrea Di Fabio <39841198+sectoramen@users.noreply.github.com>
Co-authored-by: Joseph de CLERCK <clerckj@amazon.fr>
Co-authored-by: Michael Dickinson <45626543+michael-dickinson-sainsburys@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
Co-authored-by: Leonardo Azize Martins <lazize@users.noreply.github.com>
Commit 2b2814723f (parent 42e54c42cf), authored by Toni de la Fuente on 2022-01-24 13:49:47 +01:00 and committed via GitHub.
186 changed files with 1746 additions and 394 deletions.


@@ -42,29 +42,48 @@ checkSecurityHubCompatibility(){
 resolveSecurityHubPreviousFails(){
   # Move previous check findings RecordState to ARCHIVED (as prowler didn't re-detect them)
+  SH_TEMP_FOLDER="$PROWLER_DIR/SH-$ACCOUNT_NUM"
+  if [[ ! -d $SH_TEMP_FOLDER ]]; then
+    # this folder is deleted once the security hub update is completed
+    mkdir "$SH_TEMP_FOLDER"
+  fi
   for regx in $REGIONS; do
+    REGION_FOLDER="$SH_TEMP_FOLDER/$regx"
+    if [[ ! -d $REGION_FOLDER ]]; then
+      mkdir "$REGION_FOLDER"
+    fi
     local check="$1"
     NEW_TIMESTAMP=$(get_iso8601_timestamp)
     FILTER="{\"GeneratorId\":[{\"Value\": \"prowler-$check\",\"Comparison\":\"EQUALS\"}],\"RecordState\":[{\"Value\": \"ACTIVE\",\"Comparison\":\"EQUALS\"}],\"AwsAccountId\":[{\"Value\": \"$ACCOUNT_NUM\",\"Comparison\":\"EQUALS\"}]}"
-    NEW_FINDING_IDS=$(echo -n "${SECURITYHUB_NEW_FINDINGS_IDS[@]}" | jq -cRs 'split(" ")')
-    SECURITY_HUB_PREVIOUS_FINDINGS=$($AWSCLI securityhub --region "$regx" $PROFILE_OPT get-findings --filters "${FILTER}" | jq -c --argjson ids "$NEW_FINDING_IDS" --arg updated_at $NEW_TIMESTAMP '[ .Findings[] | select( .Id| first(select($ids[] == .)) // false | not) | .RecordState = "ARCHIVED" | .UpdatedAt = $updated_at ]')
+    NEW_FINDING_FILE="$REGION_FOLDER/findings.json"
+    NEW_FINDING_IDS=$(echo -n "${SECURITYHUB_NEW_FINDINGS_IDS[@]}" | jq -cRs 'split(" ")' > $NEW_FINDING_FILE)
+    EXISTING_FILE="$REGION_FOLDER/existing.json"
+    EXISTING_FINDINGS=$($AWSCLI securityhub --region "$regx" $PROFILE_OPT get-findings --filters "${FILTER}" > $EXISTING_FILE)
+    SECURITY_HUB_PREVIOUS_FINDINGS=$(for id in $(comm -23 <(jq '[.Findings[].Id] | sort | .[]' $EXISTING_FILE) <(jq '[.[]] | sort | .[]' $NEW_FINDING_FILE));
+    do
+      jq --arg updated_at $NEW_TIMESTAMP '.Findings[] | select(.Id == '"$id"') | .RecordState = "ARCHIVED" | .UpdatedAt = $updated_at ' < $EXISTING_FILE
+    done | jq -s '.')
     if [[ $SECURITY_HUB_PREVIOUS_FINDINGS != "[]" ]]; then
       FINDINGS_COUNT=$(echo $SECURITY_HUB_PREVIOUS_FINDINGS | jq '. | length')
-      for i in `seq 0 100 $FINDINGS_COUNT`;
+      for i in $(seq 0 50 $FINDINGS_COUNT);
       do
-        BATCH_FINDINGS=$(echo $SECURITY_HUB_PREVIOUS_FINDINGS | jq -c '.['"$i:$i+100"']')
-        BATCH_IMPORT_RESULT=$($AWSCLI securityhub --region "$regx" $PROFILE_OPT batch-import-findings --findings "${BATCH_FINDINGS}")
-        if [[ -z "${BATCH_IMPORT_RESULT}" ]] || jq -e '.FailedCount >= 1' <<< "${BATCH_IMPORT_RESULT}" > /dev/null 2>&1; then
-          echo -e "\n$RED ERROR!$NORMAL Failed to send check output to AWS Security Hub\n"
+        BATCH_FINDINGS=$(echo $SECURITY_HUB_PREVIOUS_FINDINGS | jq -c '.['"$i:$i+50"']')
+        BATCH_FINDINGS_COUNT=$(echo $BATCH_FINDINGS | jq '. | length')
+        if [ "$BATCH_FINDINGS_COUNT" -gt 0 ]; then
+          BATCH_IMPORT_RESULT=$($AWSCLI securityhub --region "$regx" $PROFILE_OPT batch-import-findings --findings "${BATCH_FINDINGS}")
+          if [[ -z "${BATCH_IMPORT_RESULT}" ]] || jq -e '.FailedCount >= 1' <<< "${BATCH_IMPORT_RESULT}" > /dev/null 2>&1; then
+            echo -e "\n$RED ERROR!$NORMAL Failed to send check output to AWS Security Hub\n"
+          fi
         fi
       done
     fi
   done
+  rm -rf "$SH_TEMP_FOLDER"
 }
 sendToSecurityHub(){