Mirror of https://github.com/ghndrx/prowler.git, synced 2026-02-10 06:45:08 +00:00

feat(gcp): add Google Cloud provider with 43 checks (#2125)

README.md (24 changed lines)
@@ -116,6 +116,22 @@ Those credentials must be associated to a user or role with proper permissions t

> If you want Prowler to send findings to [AWS Security Hub](https://aws.amazon.com/security-hub), make sure you also attach the custom policy [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json).

## Google Cloud Platform

Prowler follows the same credentials search order as the [Google authentication libraries](https://cloud.google.com/docs/authentication/application-default-credentials#search_order):

1. [GOOGLE_APPLICATION_CREDENTIALS environment variable](https://cloud.google.com/docs/authentication/application-default-credentials#GAC)
2. [User credentials set up by using the Google Cloud CLI](https://cloud.google.com/docs/authentication/application-default-credentials#personal)
3. [The attached service account, returned by the metadata server](https://cloud.google.com/docs/authentication/application-default-credentials#attached-sa)

Those credentials must be associated with a user or service account that has enough permissions to run all checks. To make sure it does, grant the following roles to the member associated with the credentials:

- Viewer
- Security Reviewer
- Stackdriver Account Viewer

> `prowler` will scan the project associated with the credentials.
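The three-step search order above is the standard Application Default Credentials (ADC) resolution. A minimal illustrative sketch of the first two steps — the function name and the `gcloud` path are for illustration only, not part of Prowler:

```python
import os

def find_adc_source():
    """Illustrate the ADC search order: (1) the GOOGLE_APPLICATION_CREDENTIALS
    environment variable, (2) user credentials written by
    `gcloud auth application-default login`, and otherwise (3) fall back to
    the metadata server when running on GCP compute."""
    explicit = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if explicit:
        return ("env", explicit)
    user_creds = os.path.expanduser(
        "~/.config/gcloud/application_default_credentials.json"
    )
    if os.path.exists(user_creds):
        return ("gcloud", user_creds)
    return ("metadata-server", None)
```

In practice the `google-auth` library performs this resolution for you via `google.auth.default()`; the sketch only shows why exporting the environment variable takes precedence over `gcloud` user credentials.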
## Azure

Prowler for Azure supports the following authentication types:
@@ -229,6 +245,14 @@ prowler aws --profile custom-profile -f us-east-1 eu-south-2

```
> By default, `prowler` will scan all AWS regions.

## Google Cloud Platform

Optionally, you can provide the location of an application credential JSON file with the following argument:

```console
prowler gcp --credentials-file path
```
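Given the credentials search order above, the same effect can be achieved by exporting the standard ADC environment variable before running Prowler; a sketch, where `path/to/credentials.json` is a placeholder:

```shell
# Alternative to --credentials-file: point the default credentials search
# (step 1 of the ADC order) at the same JSON file.
# "path/to/credentials.json" is a placeholder, not a real path.
export GOOGLE_APPLICATION_CREDENTIALS="path/to/credentials.json"
echo "ADC will load: $GOOGLE_APPLICATION_CREDENTIALS"
# then run: prowler gcp
```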
## Azure

With Azure you need to specify which authentication method is going to be used:
@@ -30,6 +30,24 @@ Those credentials must be associated to a user or role with proper permissions t

> If you want Prowler to send findings to [AWS Security Hub](https://aws.amazon.com/security-hub), make sure you also attach the custom policy [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json).

## Google Cloud

### GCP Authentication

Prowler follows the same credentials search order as the [Google authentication libraries](https://cloud.google.com/docs/authentication/application-default-credentials#search_order):

1. [GOOGLE_APPLICATION_CREDENTIALS environment variable](https://cloud.google.com/docs/authentication/application-default-credentials#GAC)
2. [User credentials set up by using the Google Cloud CLI](https://cloud.google.com/docs/authentication/application-default-credentials#personal)
3. [The attached service account, returned by the metadata server](https://cloud.google.com/docs/authentication/application-default-credentials#attached-sa)

Those credentials must be associated with a user or service account that has enough permissions to run all checks. To make sure it does, grant the following roles to the member associated with the credentials:

- Viewer
- Security Reviewer
- Stackdriver Account Viewer

> `prowler` will scan the project associated with the credentials.

## Azure

Prowler for Azure supports the following authentication types:
@@ -16,7 +16,7 @@ For **Prowler v2 Documentation**, please go [here](https://github.com/prowler-cl

## About Prowler

**Prowler** is an Open Source security tool to perform AWS and Azure security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness.
**Prowler** is an Open Source security tool to perform AWS, Azure and Google Cloud security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness.

It contains hundreds of controls covering CIS, PCI-DSS, ISO27001, GDPR, HIPAA, FFIEC, SOC2, AWS FTR, ENS and custom security frameworks.
@@ -40,7 +40,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

* `Python >= 3.9`
* `Python pip >= 3.9`
* AWS and/or Azure credentials
* AWS, GCP and/or Azure credentials

_Commands_:
@@ -54,7 +54,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

_Requirements_:

* Have `docker` installed: https://docs.docker.com/get-docker/.
* AWS and/or Azure credentials
* AWS, GCP and/or Azure credentials
* In the command below, change `-v` to your local directory path in order to access the reports.

_Commands_:
@@ -71,7 +71,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

_Requirements for Ubuntu 20.04.3 LTS_:

* AWS and/or Azure credentials
* AWS, GCP and/or Azure credentials
* Install python 3.9 with: `sudo apt-get install python3.9`
* Remove python 3.8 to avoid conflicts if you can: `sudo apt-get remove python3.8`
* Make sure you have the python3 distutils package installed: `sudo apt-get install python3-distutils`
@@ -91,7 +91,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

_Requirements for Developers_:

* AWS and/or Azure credentials
* AWS, GCP and/or Azure credentials
* `git`, `Python >= 3.9`, `pip` and `poetry` installed (`pip install poetry`)

_Commands_:
@@ -108,7 +108,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

_Requirements_:

* AWS and/or Azure credentials
* AWS, GCP and/or Azure credentials
* The latest Amazon Linux 2 should come with Python 3.9 already installed; however, it may need pip. Install Python pip 3.9 with: `sudo dnf install -y python3-pip`.
* Make sure setuptools for python is already installed with: `pip3 install setuptools`
@@ -125,7 +125,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo

_Requirements_:

* `Brew` installed in your Mac or Linux
* AWS and/or Azure credentials
* AWS, GCP and/or Azure credentials

_Commands_:
@@ -194,7 +194,7 @@ You can run Prowler from your workstation, an EC2 instance, Fargate or any other

## Basic Usage

To run Prowler, you will need to specify the provider (e.g. aws or azure):
To run Prowler, you will need to specify the provider (e.g. aws, gcp or azure):
> If no provider is specified, AWS will be used for backward compatibility with most of the v2 options.

```console
@@ -226,6 +226,7 @@ For executing specific checks or services you can use options `-c`/`checks` or `

```console
prowler azure --checks storage_blob_public_access_level_is_disabled
prowler aws --services s3 ec2
prowler gcp --services iam compute
```

Also, checks and services can be excluded with options `-e`/`--excluded-checks` or `--excluded-services`:
@@ -233,6 +234,7 @@ Also, checks and services can be excluded with options `-e`/`--excluded-checks`

```console
prowler aws --excluded-checks s3_bucket_public_access
prowler azure --excluded-services defender iam
prowler gcp --excluded-services kms
```

More options and execution methods that will save you time can be found in [Miscellaneous](tutorials/misc.md).
@@ -252,6 +254,14 @@ prowler aws --profile custom-profile -f us-east-1 eu-south-2

```
> By default, `prowler` will scan all AWS regions.

### Google Cloud

Optionally, you can provide the location of an application credential JSON file with the following argument:

```console
prowler gcp --credentials-file path
```

### Azure

With Azure you need to specify which authentication method is going to be used:
@@ -13,7 +13,7 @@ Before sending findings to Prowler, you will need to perform next steps:

- Using the AWS Management Console:

3. Allow Prowler to import its findings to AWS Security Hub by adding the policy below to the role or user running Prowler:
- [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/iam/prowler-security-hub.json)
- [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json)

Once it is enabled, it is as simple as running the command below (for all regions):
poetry.lock (generated, 360 changed lines)
@@ -334,6 +334,18 @@ urllib3 = ">=1.25.4,<1.27"
[package.extras]
crt = ["awscrt (==0.16.9)"]
[[package]]
name = "cachetools"
version = "5.3.0"
description = "Extensible memoizing collections and decorators"
category = "main"
optional = false
python-versions = "~=3.7"
files = [
{file = "cachetools-5.3.0-py3-none-any.whl", hash = "sha256:429e1a1e845c008ea6c85aa35d4b98b65d6a9763eeef3e37e92728a12d1de9d4"},
{file = "cachetools-5.3.0.tar.gz", hash = "sha256:13dfddc7b8df938c21a940dfa6557ce6e94a2f1cdfa58eb90c805721d58f2c14"},
]
[[package]]
name = "certifi"
version = "2022.12.7"
@@ -862,6 +874,108 @@ files = [
[package.dependencies]
gitdb = ">=4.0.1,<5"
[[package]]
name = "google-api-core"
version = "2.11.0"
description = "Google API client core library"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
{file = "google-api-core-2.11.0.tar.gz", hash = "sha256:4b9bb5d5a380a0befa0573b302651b8a9a89262c1730e37bf423cec511804c22"},
{file = "google_api_core-2.11.0-py3-none-any.whl", hash = "sha256:ce222e27b0de0d7bc63eb043b956996d6dccab14cc3b690aaea91c9cc99dc16e"},
]
[package.dependencies]
google-auth = ">=2.14.1,<3.0dev"
googleapis-common-protos = ">=1.56.2,<2.0dev"
protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<5.0.0dev"
requests = ">=2.18.0,<3.0.0dev"
[package.extras]
grpc = ["grpcio (>=1.33.2,<2.0dev)", "grpcio (>=1.49.1,<2.0dev)", "grpcio-status (>=1.33.2,<2.0dev)", "grpcio-status (>=1.49.1,<2.0dev)"]
grpcgcp = ["grpcio-gcp (>=0.2.2,<1.0dev)"]
grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0dev)"]
[[package]]
name = "google-api-python-client"
version = "2.81.0"
description = "Google API Client Library for Python"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
{file = "google-api-python-client-2.81.0.tar.gz", hash = "sha256:8faab0b9b19d3797b455d33320c643253b6761fd0d3f3adb54792ab155d0795a"},
{file = "google_api_python_client-2.81.0-py2.py3-none-any.whl", hash = "sha256:ad6700ae3a76ead8956d7f30935978cea308530e342ad8c1e26a4e40fc05c054"},
]
[package.dependencies]
google-api-core = ">=1.31.5,<2.0.0 || >2.3.0,<3.0.0dev"
google-auth = ">=1.19.0,<3.0.0dev"
google-auth-httplib2 = ">=0.1.0"
httplib2 = ">=0.15.0,<1dev"
uritemplate = ">=3.0.1,<5"
[[package]]
name = "google-auth"
version = "2.16.2"
description = "Google Authentication Library"
category = "main"
optional = false
python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*"
files = [
{file = "google-auth-2.16.2.tar.gz", hash = "sha256:07e14f34ec288e3f33e00e2e3cc40c8942aa5d4ceac06256a28cd8e786591420"},
{file = "google_auth-2.16.2-py2.py3-none-any.whl", hash = "sha256:2fef3cf94876d1a0e204afece58bb4d83fb57228aaa366c64045039fda6770a2"},
]
[package.dependencies]
cachetools = ">=2.0.0,<6.0"
pyasn1-modules = ">=0.2.1"
rsa = {version = ">=3.1.4,<5", markers = "python_version >= \"3.6\""}
six = ">=1.9.0"
[package.extras]
aiohttp = ["aiohttp (>=3.6.2,<4.0.0dev)", "requests (>=2.20.0,<3.0.0dev)"]
enterprise-cert = ["cryptography (==36.0.2)", "pyopenssl (==22.0.0)"]
pyopenssl = ["cryptography (>=38.0.3)", "pyopenssl (>=20.0.0)"]
reauth = ["pyu2f (>=0.1.5)"]
requests = ["requests (>=2.20.0,<3.0.0dev)"]
[[package]]
name = "google-auth-httplib2"
version = "0.1.0"
description = "Google Authentication Library: httplib2 transport"
category = "main"
optional = false
python-versions = "*"
files = [
{file = "google-auth-httplib2-0.1.0.tar.gz", hash = "sha256:a07c39fd632becacd3f07718dfd6021bf396978f03ad3ce4321d060015cc30ac"},
{file = "google_auth_httplib2-0.1.0-py2.py3-none-any.whl", hash = "sha256:31e49c36c6b5643b57e82617cb3e021e3e1d2df9da63af67252c02fa9c1f4a10"},
]
[package.dependencies]
google-auth = "*"
httplib2 = ">=0.15.0"
six = "*"
[[package]]
name = "googleapis-common-protos"
version = "1.59.0"
description = "Common protobufs used in Google APIs"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
{file = "googleapis-common-protos-1.59.0.tar.gz", hash = "sha256:4168fcb568a826a52f23510412da405abd93f4d23ba544bb68d943b14ba3cb44"},
{file = "googleapis_common_protos-1.59.0-py2.py3-none-any.whl", hash = "sha256:b287dc48449d1d41af0c69f4ea26242b5ae4c3d7249a38b0984c86a4caffff1f"},
]
[package.dependencies]
protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<5.0.0dev"
[package.extras]
grpc = ["grpcio (>=1.44.0,<2.0.0dev)"]
[[package]]
name = "grapheme"
version = "0.6.0"
@@ -876,6 +990,21 @@ files = [
[package.extras]
test = ["pytest", "sphinx", "sphinx-autobuild", "twine", "wheel"]
[[package]]
name = "httplib2"
version = "0.22.0"
description = "A comprehensive HTTP client library."
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
{file = "httplib2-0.22.0-py3-none-any.whl", hash = "sha256:14ae0a53c1ba8f3d37e9e27cf37eabb0fb9980f435ba405d546948b009dd64dc"},
{file = "httplib2-0.22.0.tar.gz", hash = "sha256:d7a10bc5ef5ab08322488bde8c726eeee5c8618723fdb399597ec58f3d82df81"},
]
[package.dependencies]
pyparsing = {version = ">=2.4.2,<3.0.0 || >3.0.0,<3.0.1 || >3.0.1,<3.0.2 || >3.0.2,<3.0.3 || >3.0.3,<4", markers = "python_version > \"3.0\""}
[[package]]
name = "idna"
version = "3.4"
@@ -1580,6 +1709,56 @@ docs = ["sphinx (>=1.7.1)"]
redis = ["redis"]
tests = ["pytest (>=5.4.1)", "pytest-cov (>=2.8.1)", "pytest-mypy (>=0.8.0)", "pytest-timeout (>=2.1.0)", "redis", "sphinx (>=6.0.0)"]
[[package]]
name = "protobuf"
version = "4.22.1"
description = ""
category = "main"
optional = false
python-versions = ">=3.7"
files = [
{file = "protobuf-4.22.1-cp310-abi3-win32.whl", hash = "sha256:85aa9acc5a777adc0c21b449dafbc40d9a0b6413ff3a4f77ef9df194be7f975b"},
{file = "protobuf-4.22.1-cp310-abi3-win_amd64.whl", hash = "sha256:8bc971d76c03f1dd49f18115b002254f2ddb2d4b143c583bb860b796bb0d399e"},
{file = "protobuf-4.22.1-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:5917412347e1da08ce2939eb5cd60650dfb1a9ab4606a415b9278a1041fb4d19"},
{file = "protobuf-4.22.1-cp37-abi3-manylinux2014_aarch64.whl", hash = "sha256:9e12e2810e7d297dbce3c129ae5e912ffd94240b050d33f9ecf023f35563b14f"},
{file = "protobuf-4.22.1-cp37-abi3-manylinux2014_x86_64.whl", hash = "sha256:953fc7904ef46900262a26374b28c2864610b60cdc8b272f864e22143f8373c4"},
{file = "protobuf-4.22.1-cp37-cp37m-win32.whl", hash = "sha256:6e100f7bc787cd0a0ae58dbf0ab8bbf1ee7953f862b89148b6cf5436d5e9eaa1"},
{file = "protobuf-4.22.1-cp37-cp37m-win_amd64.whl", hash = "sha256:87a6393fa634f294bf24d1cfe9fdd6bb605cbc247af81b9b10c4c0f12dfce4b3"},
{file = "protobuf-4.22.1-cp38-cp38-win32.whl", hash = "sha256:e3fb58076bdb550e75db06ace2a8b3879d4c4f7ec9dd86e4254656118f4a78d7"},
{file = "protobuf-4.22.1-cp38-cp38-win_amd64.whl", hash = "sha256:651113695bc2e5678b799ee5d906b5d3613f4ccfa61b12252cfceb6404558af0"},
{file = "protobuf-4.22.1-cp39-cp39-win32.whl", hash = "sha256:67b7d19da0fda2733702c2299fd1ef6cb4b3d99f09263eacaf1aa151d9d05f02"},
{file = "protobuf-4.22.1-cp39-cp39-win_amd64.whl", hash = "sha256:b8700792f88e59ccecfa246fa48f689d6eee6900eddd486cdae908ff706c482b"},
{file = "protobuf-4.22.1-py3-none-any.whl", hash = "sha256:3e19dcf4adbf608924d3486ece469dd4f4f2cf7d2649900f0efcd1a84e8fd3ba"},
{file = "protobuf-4.22.1.tar.gz", hash = "sha256:dce7a55d501c31ecf688adb2f6c3f763cf11bc0be815d1946a84d74772ab07a7"},
]
[[package]]
name = "pyasn1"
version = "0.4.8"
description = "ASN.1 types and codecs"
category = "main"
optional = false
python-versions = "*"
files = [
{file = "pyasn1-0.4.8-py2.py3-none-any.whl", hash = "sha256:39c7e2ec30515947ff4e87fb6f456dfc6e84857d34be479c9d4a4ba4bf46aa5d"},
{file = "pyasn1-0.4.8.tar.gz", hash = "sha256:aef77c9fb94a3ac588e87841208bdec464471d9871bd5050a287cc9a475cd0ba"},
]
[[package]]
name = "pyasn1-modules"
version = "0.2.8"
description = "A collection of ASN.1-based protocols modules."
category = "main"
optional = false
python-versions = "*"
files = [
{file = "pyasn1-modules-0.2.8.tar.gz", hash = "sha256:905f84c712230b2c592c19470d3ca8d552de726050d1d1716282a1f6146be65e"},
{file = "pyasn1_modules-0.2.8-py2.py3-none-any.whl", hash = "sha256:a50b808ffeb97cb3601dd25981f6b016cbb3d31fbf57a8b8a87428e6158d0c74"},
]
[package.dependencies]
pyasn1 = ">=0.4.6,<0.5.0"
[[package]]
name = "pycodestyle"
version = "2.10.0"
@@ -1954,100 +2133,72 @@ pyyaml = "*"
[[package]]
name = "regex"
version = "2022.10.31"
version = "2023.3.22"
description = "Alternative regular expression module, to replace re."
category = "main"
optional = true
python-versions = ">=3.6"
python-versions = ">=3.8"
files = [
{file = "regex-2022.10.31-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:a8ff454ef0bb061e37df03557afda9d785c905dab15584860f982e88be73015f"},
{file = "regex-2022.10.31-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1eba476b1b242620c266edf6325b443a2e22b633217a9835a52d8da2b5c051f9"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d0e5af9a9effb88535a472e19169e09ce750c3d442fb222254a276d77808620b"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d03fe67b2325cb3f09be029fd5da8df9e6974f0cde2c2ac6a79d2634e791dd57"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a9d0b68ac1743964755ae2d89772c7e6fb0118acd4d0b7464eaf3921c6b49dd4"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8a45b6514861916c429e6059a55cf7db74670eaed2052a648e3e4d04f070e001"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8b0886885f7323beea6f552c28bff62cbe0983b9fbb94126531693ea6c5ebb90"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:5aefb84a301327ad115e9d346c8e2760009131d9d4b4c6b213648d02e2abe144"},
{file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:702d8fc6f25bbf412ee706bd73019da5e44a8400861dfff7ff31eb5b4a1276dc"},
{file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:a3c1ebd4ed8e76e886507c9eddb1a891673686c813adf889b864a17fafcf6d66"},
{file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:50921c140561d3db2ab9f5b11c5184846cde686bb5a9dc64cae442926e86f3af"},
{file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:7db345956ecce0c99b97b042b4ca7326feeec6b75facd8390af73b18e2650ffc"},
{file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:763b64853b0a8f4f9cfb41a76a4a85a9bcda7fdda5cb057016e7706fde928e66"},
{file = "regex-2022.10.31-cp310-cp310-win32.whl", hash = "sha256:44136355e2f5e06bf6b23d337a75386371ba742ffa771440b85bed367c1318d1"},
{file = "regex-2022.10.31-cp310-cp310-win_amd64.whl", hash = "sha256:bfff48c7bd23c6e2aec6454aaf6edc44444b229e94743b34bdcdda2e35126cf5"},
{file = "regex-2022.10.31-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4b4b1fe58cd102d75ef0552cf17242705ce0759f9695334a56644ad2d83903fe"},
{file = "regex-2022.10.31-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:542e3e306d1669b25936b64917285cdffcd4f5c6f0247636fec037187bd93542"},
{file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c27cc1e4b197092e50ddbf0118c788d9977f3f8f35bfbbd3e76c1846a3443df7"},
{file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b8e38472739028e5f2c3a4aded0ab7eadc447f0d84f310c7a8bb697ec417229e"},
{file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:76c598ca73ec73a2f568e2a72ba46c3b6c8690ad9a07092b18e48ceb936e9f0c"},
{file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c28d3309ebd6d6b2cf82969b5179bed5fefe6142c70f354ece94324fa11bf6a1"},
{file = "regex-2022.10.31-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9af69f6746120998cd9c355e9c3c6aec7dff70d47247188feb4f829502be8ab4"},
{file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a5f9505efd574d1e5b4a76ac9dd92a12acb2b309551e9aa874c13c11caefbe4f"},
{file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:5ff525698de226c0ca743bfa71fc6b378cda2ddcf0d22d7c37b1cc925c9650a5"},
{file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:4fe7fda2fe7c8890d454f2cbc91d6c01baf206fbc96d89a80241a02985118c0c"},
{file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:2cdc55ca07b4e70dda898d2ab7150ecf17c990076d3acd7a5f3b25cb23a69f1c"},
{file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:44a6c2f6374e0033873e9ed577a54a3602b4f609867794c1a3ebba65e4c93ee7"},
{file = "regex-2022.10.31-cp311-cp311-win32.whl", hash = "sha256:d8716f82502997b3d0895d1c64c3b834181b1eaca28f3f6336a71777e437c2af"},
{file = "regex-2022.10.31-cp311-cp311-win_amd64.whl", hash = "sha256:61edbca89aa3f5ef7ecac8c23d975fe7261c12665f1d90a6b1af527bba86ce61"},
{file = "regex-2022.10.31-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:0a069c8483466806ab94ea9068c34b200b8bfc66b6762f45a831c4baaa9e8cdd"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d26166acf62f731f50bdd885b04b38828436d74e8e362bfcb8df221d868b5d9b"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ac741bf78b9bb432e2d314439275235f41656e189856b11fb4e774d9f7246d81"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:75f591b2055523fc02a4bbe598aa867df9e953255f0b7f7715d2a36a9c30065c"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b30bddd61d2a3261f025ad0f9ee2586988c6a00c780a2fb0a92cea2aa702c54"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ef4163770525257876f10e8ece1cf25b71468316f61451ded1a6f44273eedeb5"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:7b280948d00bd3973c1998f92e22aa3ecb76682e3a4255f33e1020bd32adf443"},
{file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:d0213671691e341f6849bf33cd9fad21f7b1cb88b89e024f33370733fec58742"},
{file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:22e7ebc231d28393dfdc19b185d97e14a0f178bedd78e85aad660e93b646604e"},
{file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:8ad241da7fac963d7573cc67a064c57c58766b62a9a20c452ca1f21050868dfa"},
{file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:586b36ebda81e6c1a9c5a5d0bfdc236399ba6595e1397842fd4a45648c30f35e"},
{file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:0653d012b3bf45f194e5e6a41df9258811ac8fc395579fa82958a8b76286bea4"},
{file = "regex-2022.10.31-cp36-cp36m-win32.whl", hash = "sha256:144486e029793a733e43b2e37df16a16df4ceb62102636ff3db6033994711066"},
{file = "regex-2022.10.31-cp36-cp36m-win_amd64.whl", hash = "sha256:c14b63c9d7bab795d17392c7c1f9aaabbffd4cf4387725a0ac69109fb3b550c6"},
{file = "regex-2022.10.31-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:4cac3405d8dda8bc6ed499557625585544dd5cbf32072dcc72b5a176cb1271c8"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23cbb932cc53a86ebde0fb72e7e645f9a5eec1a5af7aa9ce333e46286caef783"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:74bcab50a13960f2a610cdcd066e25f1fd59e23b69637c92ad470784a51b1347"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:78d680ef3e4d405f36f0d6d1ea54e740366f061645930072d39bca16a10d8c93"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ce6910b56b700bea7be82c54ddf2e0ed792a577dfaa4a76b9af07d550af435c6"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:659175b2144d199560d99a8d13b2228b85e6019b6e09e556209dfb8c37b78a11"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:1ddf14031a3882f684b8642cb74eea3af93a2be68893901b2b387c5fd92a03ec"},
{file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b683e5fd7f74fb66e89a1ed16076dbab3f8e9f34c18b1979ded614fe10cdc4d9"},
{file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:2bde29cc44fa81c0a0c8686992c3080b37c488df167a371500b2a43ce9f026d1"},
{file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:4919899577ba37f505aaebdf6e7dc812d55e8f097331312db7f1aab18767cce8"},
{file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:9c94f7cc91ab16b36ba5ce476f1904c91d6c92441f01cd61a8e2729442d6fcf5"},
{file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:ae1e96785696b543394a4e3f15f3f225d44f3c55dafe3f206493031419fedf95"},
{file = "regex-2022.10.31-cp37-cp37m-win32.whl", hash = "sha256:c670f4773f2f6f1957ff8a3962c7dd12e4be54d05839b216cb7fd70b5a1df394"},
{file = "regex-2022.10.31-cp37-cp37m-win_amd64.whl", hash = "sha256:8e0caeff18b96ea90fc0eb6e3bdb2b10ab5b01a95128dfeccb64a7238decf5f0"},
{file = "regex-2022.10.31-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:131d4be09bea7ce2577f9623e415cab287a3c8e0624f778c1d955ec7c281bd4d"},
{file = "regex-2022.10.31-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:e613a98ead2005c4ce037c7b061f2409a1a4e45099edb0ef3200ee26ed2a69a8"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:052b670fafbe30966bbe5d025e90b2a491f85dfe5b2583a163b5e60a85a321ad"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aa62a07ac93b7cb6b7d0389d8ef57ffc321d78f60c037b19dfa78d6b17c928ee"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5352bea8a8f84b89d45ccc503f390a6be77917932b1c98c4cdc3565137acc714"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:20f61c9944f0be2dc2b75689ba409938c14876c19d02f7585af4460b6a21403e"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:29c04741b9ae13d1e94cf93fca257730b97ce6ea64cfe1eba11cf9ac4e85afb6"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:543883e3496c8b6d58bd036c99486c3c8387c2fc01f7a342b760c1ea3158a318"},
{file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:b7a8b43ee64ca8f4befa2bea4083f7c52c92864d8518244bfa6e88c751fa8fff"},
{file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:6a9a19bea8495bb419dc5d38c4519567781cd8d571c72efc6aa959473d10221a"},
{file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:6ffd55b5aedc6f25fd8d9f905c9376ca44fcf768673ffb9d160dd6f409bfda73"},
{file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:4bdd56ee719a8f751cf5a593476a441c4e56c9b64dc1f0f30902858c4ef8771d"},
{file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:8ca88da1bd78990b536c4a7765f719803eb4f8f9971cc22d6ca965c10a7f2c4c"},
{file = "regex-2022.10.31-cp38-cp38-win32.whl", hash = "sha256:5a260758454580f11dd8743fa98319bb046037dfab4f7828008909d0aa5292bc"},
{file = "regex-2022.10.31-cp38-cp38-win_amd64.whl", hash = "sha256:5e6a5567078b3eaed93558842346c9d678e116ab0135e22eb72db8325e90b453"},
{file = "regex-2022.10.31-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5217c25229b6a85049416a5c1e6451e9060a1edcf988641e309dbe3ab26d3e49"},
{file = "regex-2022.10.31-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4bf41b8b0a80708f7e0384519795e80dcb44d7199a35d52c15cc674d10b3081b"},
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cf0da36a212978be2c2e2e2d04bdff46f850108fccc1851332bcae51c8907cc"},
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d403d781b0e06d2922435ce3b8d2376579f0c217ae491e273bab8d092727d244"},
|
||||
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a37d51fa9a00d265cf73f3de3930fa9c41548177ba4f0faf76e61d512c774690"},
|
||||
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4f781ffedd17b0b834c8731b75cce2639d5a8afe961c1e58ee7f1f20b3af185"},
|
||||
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d243b36fbf3d73c25e48014961e83c19c9cc92530516ce3c43050ea6276a2ab7"},
|
||||
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:370f6e97d02bf2dd20d7468ce4f38e173a124e769762d00beadec3bc2f4b3bc4"},
|
||||
{file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:597f899f4ed42a38df7b0e46714880fb4e19a25c2f66e5c908805466721760f5"},
|
||||
{file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7dbdce0c534bbf52274b94768b3498abdf675a691fec5f751b6057b3030f34c1"},
|
||||
{file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:22960019a842777a9fa5134c2364efaed5fbf9610ddc5c904bd3a400973b0eb8"},
|
||||
{file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:7f5a3ffc731494f1a57bd91c47dc483a1e10048131ffb52d901bfe2beb6102e8"},
|
||||
{file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:7ef6b5942e6bfc5706301a18a62300c60db9af7f6368042227ccb7eeb22d0892"},
|
||||
{file = "regex-2022.10.31-cp39-cp39-win32.whl", hash = "sha256:395161bbdbd04a8333b9ff9763a05e9ceb4fe210e3c7690f5e68cedd3d65d8e1"},
|
||||
{file = "regex-2022.10.31-cp39-cp39-win_amd64.whl", hash = "sha256:957403a978e10fb3ca42572a23e6f7badff39aa1ce2f4ade68ee452dc6807692"},
|
||||
{file = "regex-2022.10.31.tar.gz", hash = "sha256:a3a98921da9a1bf8457aeee6a551948a83601689e5ecdd736894ea9bbec77e83"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:68e9add923bda8357e6fe65a568766feae369063cb7210297067675cce65272f"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:b280cb303fed94199f0b976595af71ebdcd388fb5e377a8198790f1016a23476"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:328a70e578f37f59eb54e8450b5042190bbadf2ef7f5c0b60829574b62955ed7"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c00c357a4914f58398503c7f716cf1646b1e36b8176efa35255f5ebfacedfa46"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d40cecf4bcb2cb37c59e3c79e5bbc45d47e3f3e07edf24e35fc5775db2570058"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:43469c22fcf705a7cb59c7e01d6d96975bdbc54c1138900f04d11496489a0054"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d4d3571c8eb21f0fbe9f0b21b49092c24d442f9a295f079949df3551b2886f29"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:148ad520f41021b97870e9c80420e6cdaadcc5e4306e613aed84cd5d53f8a7ca"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:24242e5f26823e95edd64969bd206d4752c1a56a744d8cbcf58461f9788bc0c7"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:60fcef5c3144d861b623456d87ca7fff7af59a4a918e1364cdd0687b48285285"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:533ba64d67d882286557106a1c5f12b4c2825f11b47a7c209a8c22922ca882be"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:80a288b21b17e39fb3630cf1d14fd704499bb11d9c8fc110662a0c57758d3d3e"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fa41a427d4f03ec6d6da2fd8a230f4f388f336cd7ca46b46c4d2a1bca3ead85a"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-win32.whl", hash = "sha256:3c4fa90fd91cc2957e66195ce374331bebbc816964864f64b42bd14bda773b53"},
|
||||
{file = "regex-2023.3.22-cp310-cp310-win_amd64.whl", hash = "sha256:a4c7b8c5a3a186b49415af3be18e4b8f93b33d6853216c0a1d7401736b703bce"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0a2a851d0548a4e298d88e3ceeb4bad4aab751cf1883edf6150f25718ce0207a"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f2bc8a9076ea7add860d57dbee0554a212962ecf2a900344f2fc7c56a02463b0"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e30d9a6fd7a7a6a4da6f80d167ce8eda4a993ff24282cbc73f34186c46a498db"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3371975b165c1e859e1990e5069e8606f00b25aed961cfd25b7bac626b1eb5a9"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33c887b658afb144cdc8ce9156a0e1098453060c18b8bd5177f831ad58e0d60d"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd47362e03acc780aad5a5bc4624d495594261b55a1f79a5b775b6be865a5911"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7798b3d662f70cea425637c54da30ef1894d426cab24ee7ffaaccb24a8b17bb8"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:bdab2c90665b88faf5cc5e11bf835d548f4b8d8060c89fc70782b6020850aa1c"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:55f907c4d18a5a40da0ceb339a0beda77c9df47c934adad987793632fb4318c3"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:e00b046000b313ffaa2f6e8d7290b33b08d2005150eff4c8cf3ad74d011888d1"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:20ce96da2093e72e151d6af8217a629aeb5f48f1ac543c2fffd1d87c57699d7e"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:8527ea0978ed6dc58ccb3935bd2883537b455c97ec44b5d8084677dfa817f96b"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-win32.whl", hash = "sha256:4c9c3db90acd17e4231344a23616f33fd79837809584ce30e2450ca312fa47aa"},
|
||||
{file = "regex-2023.3.22-cp311-cp311-win_amd64.whl", hash = "sha256:e1b56dac5e86ab52e0443d63b02796357202a8f8c5966b69f8d4c03a94778e98"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:33bab9c9af936123b70b9874ce83f2bcd54be76b97637b33d31560fba8ad5d78"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b59233cb8df6b60fff5f3056f6f342a8f5f04107a11936bf49ebff87dd4ace34"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3f6f29cb134d782685f8eda01d72073c483c7f87b318b5101c7001faef7850f5"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d15a0cc48f7a3055e89df1bd6623a907c407d1f58f67ff47064e598d4a550de4"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:159c7b83488a056365119ada0bceddc06a455d3db7a7aa3cf07f13b2878b885f"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aff7c778d9229d66f716ad98a701fa91cf97935ae4a32a145ae9e61619906aaa"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3e66cfc915f5f7e2c8a0af8a27f87aa857f440de7521fd7f2682e23f082142a1"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:3b4da28d89527572f0d4a24814e353e1228a7aeda965e5d9265c1435a154b17a"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:5da83c964aecb6c3f2a6c9a03f3d0fa579e1ad208e2c264ba826cecd19da11fa"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:81291006a934052161eae8340e7731ea6b8595b0c27dd4927c4e8a489e1760e2"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:c95a977cfdccb8ddef95ddd77cf586fe9dc327c7c93cf712983cece70cdaa1be"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:cdd3d2df486c9a8c6d08f78bdfa8ea7cf6191e037fde38c2cf6f5f0559e9d353"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:f311ca33fcb9f8fb060c1fa76238d8d029f33b71a2021bafa5d423cc25965b54"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-win32.whl", hash = "sha256:2e2e6baf4a1108f84966f44870b26766d8f6d104c9959aae329078327c677122"},
|
||||
{file = "regex-2023.3.22-cp38-cp38-win_amd64.whl", hash = "sha256:60b545806a433cc752b9fa936f1c0a63bf96a3872965b958b35bd0d5d788d411"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5826e7fb443acb49f64f9648a2852efc8d9af2f4c67f6c3dca69dccd9e8e1d15"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:59b3aab231c27cd754d6452c43b12498d34e7ab87d69a502bd0220f4b1c090c4"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:97326d62255203c6026896d4b1ad6b5a0141ba097cae00ed3a508fe454e96baf"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:59a15c2803c20702d7f2077807d9a2b7d9a168034b87fd3f0d8361de60019a1e"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4ad467524cb6879ce42107cf02a49cdb4a06f07fe0e5f1160d7db865a8d25d4b"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:617d101b95151d827d5366e9c4225a68c64d56065e41ab9c7ef51bb87f347a8a"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:548257463696daf919d2fdfc53ee4b98e29e3ffc5afddd713d83aa849d1fa178"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:1937946dd03818845bd9c1713dfd3173a7b9a324e6593a235fc8c51c9cd460eb"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d94a0d25e517c76c9ce9e2e2635d9d1a644b894f466a66a10061f4e599cdc019"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:87016850c13082747bd120558e6750746177bd492b103b2fca761c8a1c43fba9"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:3582db55372eaee9e998d378109c4b9b15beb2c84624c767efe351363fada9c4"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:88552925fd22320600c59ee80342d6eb06bfa9503c3a402d7327983f5fa999d9"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8d7477ebaf5d3621c763702e1ec0daeede8863fb22459c5e26ddfd17e9b1999c"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-win32.whl", hash = "sha256:dcc5b0d6a94637c071a427dc4469efd0ae4fda8ff384790bc8b5baaf9308dc3e"},
|
||||
{file = "regex-2023.3.22-cp39-cp39-win_amd64.whl", hash = "sha256:f1977c1fe28173f2349d42c59f80f10a97ce34f2bedb7b7f55e2e8a8de9b7dfb"},
|
||||
{file = "regex-2023.3.22.tar.gz", hash = "sha256:f579a202b90c1110d0894a86b32a89bf550fdb34bdd3f9f550115706be462e19"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
@@ -2146,6 +2297,21 @@ pygments = ">=2.13.0,<3.0.0"
[package.extras]
jupyter = ["ipywidgets (>=7.5.1,<9)"]

[[package]]
name = "rsa"
version = "4.9"
description = "Pure-Python RSA implementation"
category = "main"
optional = false
python-versions = ">=3.6,<4"
files = [
    {file = "rsa-4.9-py3-none-any.whl", hash = "sha256:90260d9058e514786967344d0ef75fa8727eed8a7d2e43ce9f4bcf1b536174f7"},
    {file = "rsa-4.9.tar.gz", hash = "sha256:e38464a49c6c85d7f1351b0126661487a7e0a14a50f1675ec50eb34d4f20ef21"},
]

[package.dependencies]
pyasn1 = ">=0.1.3"

[[package]]
name = "ruamel-yaml"
version = "0.17.21"
@@ -2432,6 +2598,18 @@ files = [
    {file = "typing_extensions-4.5.0.tar.gz", hash = "sha256:5cb5f4a79139d699607b3ef622a1dedafa84e115ab0024e0d9c044a9479ca7cb"},
]

[[package]]
name = "uritemplate"
version = "4.1.1"
description = "Implementation of RFC 6570 URI Templates"
category = "main"
optional = false
python-versions = ">=3.6"
files = [
    {file = "uritemplate-4.1.1-py2.py3-none-any.whl", hash = "sha256:830c08b8d99bdd312ea4ead05994a38e8936266f84b9a7878232db50b044e02e"},
    {file = "uritemplate-4.1.1.tar.gz", hash = "sha256:4346edfc5c3b79f694bccd6d6099a322bbeb628dbf2cd86eea55a456ce5124f0"},
]

[[package]]
name = "urllib3"
version = "1.26.15"
@@ -2670,4 +2848,4 @@ docs = ["mkdocs", "mkdocs-material"]
[metadata]
lock-version = "2.0"
python-versions = "^3.9"
content-hash = "ab9263db7c7c836f9192444bcf680a4183d16ae49cff7d60082e61a80973001d"
content-hash = "245d449b4b8d1cbe6be0a125761312a6a53bf1f050440a41ffe8034945064fb8"

0
prowler/compliance/gcp/__init__.py
Normal file
@@ -33,6 +33,8 @@ with os.scandir(compliance_aws_dir) as files:
# AWS services-regions matrix json
aws_services_json_file = "aws_regions_by_service.json"

# gcp_zones_json_file = "gcp_zones.json"

default_output_directory = getcwd() + "/output"

output_file_timestamp = timestamp.strftime("%Y%m%d%H%M%S")

@@ -128,6 +128,23 @@ class Check_Report_Azure(Check_Report):
        self.subscription = ""


@dataclass
class Check_Report_GCP(Check_Report):
    """Contains the GCP Check's finding information."""

    resource_name: str
    resource_id: str
    project_id: str
    location: str

    def __init__(self, metadata):
        super().__init__(metadata)
        self.resource_name = ""
        self.resource_id = ""
        self.project_id = ""
        self.location = ""


# Testing Pending
def load_check_metadata(metadata_file: str) -> Check_Metadata_Model:
    """load_check_metadata loads and parse a Check's metadata file"""

@@ -58,6 +58,7 @@ Detailed documentation at https://docs.prowler.cloud
        # Init Providers Arguments
        self.__init_aws_parser__()
        self.__init_azure_parser__()
        self.__init_gcp_parser__()

    def parse(self, args=None) -> argparse.Namespace:
        """
@@ -431,3 +432,18 @@ Detailed documentation at https://docs.prowler.cloud
            default=[],
            help="Azure subscription ids to be scanned by prowler",
        )

    def __init_gcp_parser__(self):
        """Init the GCP Provider CLI parser"""
        gcp_parser = self.subparsers.add_parser(
            "gcp", parents=[self.common_providers_parser], help="GCP Provider"
        )
        # Authentication Modes
        gcp_auth_subparser = gcp_parser.add_argument_group("Authentication Modes")
        gcp_auth_modes_group = gcp_auth_subparser.add_mutually_exclusive_group()
        gcp_auth_modes_group.add_argument(
            "--credentials-file",
            nargs="?",
            metavar="FILE_PATH",
            help="Authenticate using a Google Service Account Application Credentials JSON file",
        )
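The `--credentials-file` flag in the hunk above lives inside a mutually exclusive group of an argument group, so future authentication modes cannot be combined on one invocation. A minimal stand-alone sketch of that argparse wiring (the parser objects are re-created here purely for illustration; only the `gcp` subcommand and `--credentials-file` are taken from the diff):

```python
import argparse

# Top-level parser with a provider subcommand, mirroring the PR's structure.
parser = argparse.ArgumentParser(prog="prowler")
subparsers = parser.add_subparsers(dest="provider")

# "gcp" subcommand with a mutually exclusive authentication group.
gcp_parser = subparsers.add_parser("gcp", help="GCP Provider")
gcp_auth_group = gcp_parser.add_argument_group("Authentication Modes")
gcp_auth_modes = gcp_auth_group.add_mutually_exclusive_group()
gcp_auth_modes.add_argument(
    "--credentials-file",
    nargs="?",
    metavar="FILE_PATH",
    help="Authenticate using a Google Service Account Application Credentials JSON file",
)

# argparse stores "--credentials-file" under the attribute "credentials_file".
args = parser.parse_args(["gcp", "--credentials-file", "/tmp/creds.json"])
print(args.provider, args.credentials_file)
```

Parsing `prowler gcp` with no flag simply leaves `credentials_file` as `None`, which is what lets the provider fall back to Application Default Credentials.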
@@ -16,11 +16,13 @@ from prowler.lib.outputs.models import (
    Check_Output_CSV_CIS,
    Check_Output_CSV_ENS_RD2022,
    Check_Output_CSV_Generic_Compliance,
    Gcp_Check_Output_CSV,
    generate_csv_fields,
)
from prowler.lib.utils.utils import file_exists, open_file
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info
from prowler.providers.azure.lib.audit_info.models import Azure_Audit_Info
from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info


def initialize_file_descriptor(
@@ -82,6 +84,13 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit
                audit_info,
                Azure_Check_Output_CSV,
            )
        if isinstance(audit_info, GCP_Audit_Info):
            file_descriptor = initialize_file_descriptor(
                filename,
                output_mode,
                audit_info,
                Gcp_Check_Output_CSV,
            )
        file_descriptors.update({output_mode: file_descriptor})

    elif output_mode == "json":

@@ -56,6 +56,19 @@ def generate_provider_output_csv(
        )
        finding_output = output_model(**data)

    if provider == "gcp":
        data["resource_id"] = finding.resource_id
        data["resource_name"] = finding.resource_name
        data["project_id"] = finding.project_id
        data["location"] = finding.location
        data[
            "finding_unique_id"
        ] = f"prowler-{provider}-{finding.check_metadata.CheckID}-{finding.project_id}-{finding.resource_id}"
        data["compliance"] = unroll_dict(
            get_check_compliance(finding, provider, output_options)
        )
        finding_output = output_model(**data)

    if provider == "aws":
        data["profile"] = audit_info.profile
        data["account_id"] = audit_info.audited_account
@@ -305,6 +318,17 @@ class Azure_Check_Output_CSV(Check_Output_CSV):
    resource_name: str = ""


class Gcp_Check_Output_CSV(Check_Output_CSV):
    """
    Gcp_Check_Output_CSV generates a finding's output in CSV format for the GCP provider.
    """

    project_id: str = ""
    location: str = ""
    resource_id: str = ""
    resource_name: str = ""


def generate_provider_output_json(
    provider: str, finding, audit_info, mode: str, output_options
):
@@ -333,6 +357,16 @@
            finding, provider, output_options
        )

    if provider == "gcp":
        finding_output.ProjectId = audit_info.project_id
        finding_output.Location = finding.location
        finding_output.ResourceId = finding.resource_id
        finding_output.ResourceName = finding.resource_name
        finding_output.FindingUniqueId = f"prowler-{provider}-{finding.check_metadata.CheckID}-{finding.project_id}-{finding.resource_id}"
        finding_output.Compliance = get_check_compliance(
            finding, provider, output_options
        )

    if provider == "aws":
        finding_output.Profile = audit_info.profile
        finding_output.AccountId = audit_info.audited_account
@@ -421,6 +455,20 @@ class Azure_Check_Output_JSON(Check_Output_JSON):
        super().__init__(**metadata)


class Gcp_Check_Output_JSON(Check_Output_JSON):
    """
    Gcp_Check_Output_JSON generates a finding's output in JSON format for the GCP provider.
    """

    ProjectId: str = ""
    ResourceId: str = ""
    ResourceName: str = ""
    Location: str = ""

    def __init__(self, **metadata):
        super().__init__(**metadata)


class Check_Output_CSV_ENS_RD2022(BaseModel):
    """
    Check_Output_CSV_ENS_RD2022 generates a finding's output in CSV ENS RD2022 format.

@@ -33,6 +33,8 @@ def stdout_report(finding, color, verbose, is_quiet):
        details = finding.region
    if finding.check_metadata.Provider == "azure":
        details = finding.check_metadata.ServiceName
    if finding.check_metadata.Provider == "gcp":
        details = finding.location

    if verbose and not (is_quiet and finding.status != "FAIL"):
        print(

@@ -26,6 +26,9 @@ def display_summary_table(
        else:
            entity_type = "Tenant ID/s"
            audited_entities = " ".join(audit_info.identity.tenant_ids)
    elif provider == "gcp":
        entity_type = "Project ID"
        audited_entities = audit_info.project_id

    if findings:
        current = {
@@ -53,7 +56,6 @@ def display_summary_table(
                current["Service"] != finding.check_metadata.ServiceName
                and current["Service"]
            ):

                add_service_to_table(findings_table, current)

                current["Total"] = current["Critical"] = current["High"] = current[

@@ -25,6 +25,9 @@ from prowler.providers.aws.lib.resource_api_tagging.resource_api_tagging import
from prowler.providers.azure.azure_provider import Azure_Provider
from prowler.providers.azure.lib.audit_info.audit_info import azure_audit_info
from prowler.providers.azure.lib.audit_info.models import Azure_Audit_Info
from prowler.providers.gcp.gcp_provider import GCP_Provider
from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info
from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info


class Audit_Info:
@@ -41,7 +44,7 @@
        else:
            return caller_identity

    def print_audit_credentials(self, audit_info: AWS_Audit_Info):
    def print_aws_credentials(self, audit_info: AWS_Audit_Info):
        # Beautify audited regions, set "all" if there is no filter region
        regions = (
            ", ".join(audit_info.audited_regions)
@@ -61,6 +64,25 @@ Caller Identity ARN: {Fore.YELLOW}[{audit_info.audited_identity_arn}]{Style.RESE
        # If -A is set, print Assumed Role ARN
        if audit_info.assumed_role_info.role_arn is not None:
            report += f"""Assumed Role ARN: {Fore.YELLOW}[{audit_info.assumed_role_info.role_arn}]{Style.RESET_ALL}
"""
        print(report)

    def print_gcp_credentials(self, audit_info: GCP_Audit_Info):
        # Beautify audited profile, set "default" if there is no profile set
        try:
            getattr(audit_info.credentials, "_service_account_email")
            profile = (
                audit_info.credentials._service_account_email
                if audit_info.credentials._service_account_email is not None
                else "default"
            )
        except AttributeError:
            profile = "default"

        report = f"""
This report is being generated using credentials below:

GCP Account: {Fore.YELLOW}[{profile}]{Style.RESET_ALL} GCP Project ID: {Fore.YELLOW}[{audit_info.project_id}]{Style.RESET_ALL}
"""
        print(report)

@@ -257,7 +279,7 @@ Caller Identity ARN: {Fore.YELLOW}[{audit_info.audited_identity_arn}]{Style.RESE
            current_audit_info.profile_region = "us-east-1"

        if not arguments.get("only_logs"):
            self.print_audit_credentials(current_audit_info)
            self.print_aws_credentials(current_audit_info)

        # Parse Scan Tags
        if arguments.get("resource_tags"):
@@ -320,6 +342,29 @@ Caller Identity ARN: {Fore.YELLOW}[{audit_info.audited_identity_arn}]{Style.RESE

        return azure_audit_info

    def set_gcp_audit_info(self, arguments) -> GCP_Audit_Info:
        """
        set_gcp_audit_info returns the GCP_Audit_Info
        """
        logger.info("Setting GCP session ...")

        logger.info("Checking if any credentials mode is set ...")
        credentials_file = arguments.get("credentials_file")

        gcp_provider = GCP_Provider(
            credentials_file,
        )

        (
            gcp_audit_info.credentials,
            gcp_audit_info.project_id,
        ) = gcp_provider.get_credentials()

        if not arguments.get("only_logs"):
            self.print_gcp_credentials(gcp_audit_info)

        return gcp_audit_info


def set_provider_audit_info(provider: str, arguments: dict):
    """

@@ -77,6 +77,27 @@ class Azure_Output_Options(Provider_Output_Options):
            arguments.output_modes.remove("html")


class Gcp_Output_Options(Provider_Output_Options):
    def __init__(self, arguments, audit_info, allowlist_file, bulk_checks_metadata):
        # First call Provider_Output_Options init
        super().__init__(arguments, allowlist_file, bulk_checks_metadata)

        # Check if custom output filename was input, if not, set the default
        if (
            not hasattr(arguments, "output_filename")
            or arguments.output_filename is None
        ):
            self.output_filename = (
                f"prowler-output-{audit_info.project_id}-{output_file_timestamp}"
            )
        else:
            self.output_filename = arguments.output_filename

        # Remove HTML Output since it is not supported yet
        if "html" in arguments.output_modes:
            arguments.output_modes.remove("html")


class Aws_Output_Options(Provider_Output_Options):
    security_hub_enabled: bool

0
prowler/providers/gcp/__init__.py
Normal file
51
prowler/providers/gcp/gcp_provider.py
Normal file
@@ -0,0 +1,51 @@
import os
import sys

from google import auth
from googleapiclient import discovery
from googleapiclient.discovery import Resource

from prowler.lib.logger import logger
from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info


class GCP_Provider:
    def __init__(
        self,
        credentials_file: str,
    ):
        logger.info("Instantiating GCP Provider ...")
        self.credentials, self.project_id = self.__set_credentials__(credentials_file)

    def __set_credentials__(self, credentials_file):
        try:
            if credentials_file:
                self.__set_gcp_creds_env_var__(credentials_file)

            return auth.default()
        except Exception as error:
            logger.critical(f"{error.__class__.__name__} -- {error}")
            sys.exit(1)

    def __set_gcp_creds_env_var__(self, credentials_file):
        logger.info(
            "GCP provider: Setting GOOGLE_APPLICATION_CREDENTIALS environment variable..."
        )
        client_secrets_path = os.path.abspath(credentials_file)
        os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = client_secrets_path

    def get_credentials(self):
        return self.credentials, self.project_id


def generate_client(
    service: str,
    api_version: str,
    audit_info: GCP_Audit_Info,
) -> Resource:
    try:
        return discovery.build(service, api_version, credentials=audit_info.credentials)
    except Exception as error:
        logger.error(
            f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
        )
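`GCP_Provider.__set_credentials__` above only exports `GOOGLE_APPLICATION_CREDENTIALS` when a key file is supplied and then defers entirely to `google.auth.default()`, which walks Google's Application Default Credentials search order. A rough, stdlib-only sketch of that resolution order (the function name, parameters, and return labels are hypothetical, for illustration only; the real lookup is done by the google-auth library):

```python
import os

def resolve_adc_source(env=None, gcloud_user_creds=False, on_metadata_server=False):
    # Illustrative only: mirrors the ADC search order that
    # google.auth.default() follows.
    # 1. GOOGLE_APPLICATION_CREDENTIALS pointing at a key file
    # 2. user credentials set up by the gcloud CLI
    # 3. the attached service account from the metadata server
    env = os.environ if env is None else env
    if env.get("GOOGLE_APPLICATION_CREDENTIALS"):
        return "env-var-key-file"
    if gcloud_user_creds:
        return "gcloud-user-credentials"
    if on_metadata_server:
        return "attached-service-account"
    return None  # google-auth would raise DefaultCredentialsError here

print(resolve_adc_source(env={"GOOGLE_APPLICATION_CREDENTIALS": "/tmp/sa.json"}))
```

This is why passing `--credentials-file` works: it simply puts the supplied path first in line for the same default lookup.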
0
prowler/providers/gcp/lib/audit_info/__init__.py
Normal file
8
prowler/providers/gcp/lib/audit_info/audit_info.py
Normal file
@@ -0,0 +1,8 @@
from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info

gcp_audit_info = GCP_Audit_Info(
    credentials=None,
    project_id=None,
    audit_resources=None,
    audit_metadata=None,
)
18
prowler/providers/gcp/lib/audit_info/models.py
Normal file
@@ -0,0 +1,18 @@
from dataclasses import dataclass
from typing import Any, Optional

from google.oauth2.credentials import Credentials


@dataclass
class GCP_Audit_Info:
    credentials: Credentials
    project_id: str
    audit_resources: Optional[Any]
    audit_metadata: Optional[Any]

    def __init__(self, credentials, project_id, audit_metadata, audit_resources):
        self.credentials = credentials
        self.project_id = project_id
        self.audit_metadata = audit_metadata
        self.audit_resources = audit_resources
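`GCP_Audit_Info` is instantiated empty as a module-level singleton in `audit_info.py` and mutated later, once `set_gcp_audit_info()` has resolved credentials. A self-contained sketch of that fill-in-later pattern (the class is re-declared locally with permissive `Optional` types so it runs without the google libraries, and the project id is made up):

```python
from dataclasses import dataclass
from typing import Any, Optional

# Local re-declaration for illustration; the real class lives in
# prowler/providers/gcp/lib/audit_info/models.py.
@dataclass
class GCP_Audit_Info:
    credentials: Optional[Any]
    project_id: Optional[str]
    audit_resources: Optional[Any]
    audit_metadata: Optional[Any]

# Module-level singleton: created empty at import time ...
gcp_audit_info = GCP_Audit_Info(
    credentials=None, project_id=None, audit_resources=None, audit_metadata=None
)

# ... and populated later by the provider setup step.
gcp_audit_info.project_id = "my-example-project"  # hypothetical project id
print(gcp_audit_info.project_id)
```

Service classes (such as `BigQuery(gcp_audit_info)` below) read the singleton at import time, which is why the population step must run before any check module is loaded.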
0
prowler/providers/gcp/services/bigquery/__init__.py
Normal file
@@ -0,0 +1,4 @@
from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info
from prowler.providers.gcp.services.bigquery.bigquery_service import BigQuery

bigquery_client = BigQuery(gcp_audit_info)
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "bigquery_dataset_cmk_encryption",
  "CheckTitle": "Ensure BigQuery datasets are encrypted with Customer-Managed Keys (CMKs).",
  "CheckType": [],
  "ServiceName": "bigquery",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "high",
  "ResourceType": "Dataset",
  "Description": "Ensure BigQuery datasets are encrypted with Customer-Managed Keys (CMKs) in order to have a more granular control over data encryption/decryption process.",
  "Risk": "If you want to have greater control, Customer-managed encryption keys (CMEK) can be used as encryption key management solution for BigQuery Data Sets.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "https://docs.bridgecrew.io/docs/bc_gcp_sql_11#cli-command",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/BigQuery/enable-table-encryption-with-cmks.html",
      "Terraform": "https://docs.bridgecrew.io/docs/ensure-gcp-big-query-tables-are-encrypted-with-customer-supplied-encryption-keys-csek-1#terraform"
    },
    "Recommendation": {
      "Text": "Encrypting datasets with Cloud KMS Customer-Managed Keys (CMKs) will allow for a more granular control over data encryption/decryption process.",
      "Url": "https://cloud.google.com/bigquery/docs/customer-managed-encryption"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,23 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.bigquery.bigquery_client import bigquery_client


class bigquery_dataset_cmk_encryption(Check):
    def execute(self) -> Check_Report_GCP:
        findings = []
        for dataset in bigquery_client.datasets:
            report = Check_Report_GCP(self.metadata())
            report.project_id = bigquery_client.project_id
            report.resource_id = dataset.id
            report.resource_name = dataset.name
            report.location = dataset.region
            report.status = "PASS"
            report.status_extended = (
                f"Dataset {dataset.name} is encrypted with Customer-Managed Keys (CMKs)"
            )
            if not dataset.cmk_encryption:
                report.status = "FAIL"
                report.status_extended = f"Dataset {dataset.name} is not encrypted with Customer-Managed Keys (CMKs)"
            findings.append(report)

        return findings
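The check above follows the common prowler pattern: default every report to PASS, then flip to FAIL when the condition (here, missing CMK encryption) is detected. A condensed, dependency-free sketch of that decision logic (the `FakeDataset` model and `evaluate` helper are hypothetical stand-ins for the BigQuery service objects):

```python
from dataclasses import dataclass

# Hypothetical stand-in for the BigQuery service's dataset model.
@dataclass
class FakeDataset:
    name: str
    cmk_encryption: bool

def evaluate(datasets):
    # Default to PASS, downgrade to FAIL when CMK encryption is absent,
    # mirroring the report.status handling in the check above.
    results = []
    for ds in datasets:
        status = "PASS" if ds.cmk_encryption else "FAIL"
        results.append((ds.name, status))
    return results

print(evaluate([FakeDataset("sales", True), FakeDataset("logs", False)]))
# → [('sales', 'PASS'), ('logs', 'FAIL')]
```

Every dataset yields exactly one finding either way, so a scan's finding count equals the dataset count regardless of how many fail.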
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "bigquery_dataset_public_access",
  "CheckTitle": "Ensure That BigQuery Datasets Are Not Anonymously or Publicly Accessible.",
  "CheckType": [],
  "ServiceName": "bigquery",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "high",
  "ResourceType": "Dataset",
  "Description": "Ensure That BigQuery Datasets Are Not Anonymously or Publicly Accessible.",
  "Risk": "Granting permissions to allUsers or allAuthenticatedUsers allows anyone to access the dataset. Such access might not be desirable if sensitive data is being stored in the dataset. Therefore, ensure that anonymous and/or public access to a dataset is not allowed.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/BigQuery/publicly-accessible-big-query-datasets.html",
      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_general_3#terraform"
    },
    "Recommendation": {
      "Text": "It is recommended that the IAM policy on BigQuery datasets does not allow anonymous and/or public access.",
      "Url": "https://cloud.google.com/bigquery/docs/customer-managed-encryption"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,23 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.bigquery.bigquery_client import bigquery_client


class bigquery_dataset_public_access(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for dataset in bigquery_client.datasets:
            report = Check_Report_GCP(self.metadata())
            report.project_id = bigquery_client.project_id
            report.resource_id = dataset.id
            report.resource_name = dataset.name
            report.location = dataset.region
            report.status = "PASS"
            report.status_extended = f"Dataset {dataset.name} is not publicly accessible"
            if dataset.public:
                report.status = "FAIL"
                report.status_extended = (
                    f"Dataset {dataset.name} is publicly accessible"
                )
            findings.append(report)

        return findings
113
prowler/providers/gcp/services/bigquery/bigquery_service.py
Normal file
@@ -0,0 +1,113 @@
from pydantic import BaseModel

from prowler.lib.logger import logger
from prowler.providers.gcp.gcp_provider import generate_client


################## BigQuery
class BigQuery:
    def __init__(self, audit_info):
        self.service = "bigquery"
        self.api_version = "v2"
        self.project_id = audit_info.project_id
        self.client = generate_client(self.service, self.api_version, audit_info)
        self.datasets = []
        self.tables = []
        self.__get_datasets__()
        self.__get_tables__()

    def __get_datasets__(self):
        try:
            request = self.client.datasets().list(projectId=self.project_id)
            while request is not None:
                response = request.execute()

                for dataset in response.get("datasets", []):
                    dataset_info = (
                        self.client.datasets()
                        .get(
                            projectId=self.project_id,
                            datasetId=dataset["datasetReference"]["datasetId"],
                        )
                        .execute()
                    )
                    cmk_encryption = False
                    public = False
                    roles = dataset_info.get("access", [])
                    if "allAuthenticatedUsers" in str(roles) or "allUsers" in str(
                        roles
                    ):
                        public = True
                    if dataset_info.get("defaultEncryptionConfiguration"):
                        cmk_encryption = True
                    self.datasets.append(
                        Dataset(
                            name=dataset["datasetReference"]["datasetId"],
                            id=dataset["id"],
                            region=dataset["location"],
                            cmk_encryption=cmk_encryption,
                            public=public,
                        )
                    )

                request = self.client.datasets().list_next(
                    previous_request=request, previous_response=response
                )
        except Exception as error:
            logger.error(
                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )

    def __get_tables__(self):
        try:
            for dataset in self.datasets:
                request = self.client.tables().list(
                    projectId=self.project_id, datasetId=dataset.name
                )
                while request is not None:
                    response = request.execute()

                    for table in response.get("tables", []):
                        cmk_encryption = False
                        if (
                            self.client.tables()
                            .get(
                                projectId=self.project_id,
                                datasetId=dataset.name,
                                tableId=table["tableReference"]["tableId"],
                            )
                            .execute()
                            .get("encryptionConfiguration")
                        ):
                            cmk_encryption = True
                        self.tables.append(
                            Table(
                                name=table["tableReference"]["tableId"],
                                id=table["id"],
                                region=dataset.region,
                                cmk_encryption=cmk_encryption,
                            )
                        )

                    request = self.client.tables().list_next(
                        previous_request=request, previous_response=response
                    )
        except Exception as error:
            logger.error(
                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )


class Dataset(BaseModel):
    name: str
    id: str
    region: str
    cmk_encryption: bool
    public: bool


class Table(BaseModel):
    name: str
    id: str
    region: str
    cmk_encryption: bool
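The public-access detection in `__get_datasets__` keys off the dataset's `access` entries. The helper below is a self-contained restatement of that substring test (the real service inlines the logic; `is_public` is a hypothetical name for this sketch):

```python
def is_public(access_entries):
    # Mirrors the service's detection: a dataset is treated as public when
    # any access entry grants a role to allUsers or allAuthenticatedUsers.
    # Like the service, this checks the stringified entries rather than
    # walking each entry's member field.
    blob = str(access_entries)
    return "allUsers" in blob or "allAuthenticatedUsers" in blob


print(is_public([{"role": "READER", "iamMember": "allUsers"}]))  # True
print(is_public([{"role": "OWNER", "userByEmail": "admin@example.com"}]))  # False
```

Stringifying the whole access list is a blunt but effective test here, since `allUsers` and `allAuthenticatedUsers` are reserved member names that cannot appear as ordinary user emails.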
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "bigquery_table_cmk_encryption",
  "CheckTitle": "Ensure BigQuery tables are encrypted with Customer-Managed Keys (CMKs).",
  "CheckType": [],
  "ServiceName": "bigquery",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "high",
  "ResourceType": "Table",
  "Description": "Ensure BigQuery tables are encrypted with Customer-Managed Keys (CMKs) in order to have more granular control over the data encryption and decryption process.",
  "Risk": "If you want to have greater control, Customer-Managed Encryption Keys (CMEK) can be used as an encryption key management solution for BigQuery tables.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/BigQuery/enable-table-encryption-with-cmks.html",
      "Terraform": "https://docs.bridgecrew.io/docs/ensure-gcp-big-query-tables-are-encrypted-with-customer-supplied-encryption-keys-csek#terraform"
    },
    "Recommendation": {
      "Text": "Encrypting tables with Cloud KMS Customer-Managed Keys (CMKs) allows for more granular control over the data encryption and decryption process.",
      "Url": "https://cloud.google.com/bigquery/docs/customer-managed-encryption"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,23 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.bigquery.bigquery_client import bigquery_client


class bigquery_table_cmk_encryption(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for table in bigquery_client.tables:
            report = Check_Report_GCP(self.metadata())
            report.project_id = bigquery_client.project_id
            report.resource_id = table.id
            report.resource_name = table.name
            report.location = table.region
            report.status = "PASS"
            report.status_extended = (
                f"Table {table.name} is encrypted with Customer-Managed Keys (CMKs)"
            )
            if not table.cmk_encryption:
                report.status = "FAIL"
                report.status_extended = f"Table {table.name} is not encrypted with Customer-Managed Keys (CMKs)"
            findings.append(report)

        return findings
@@ -0,0 +1,6 @@
from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info
from prowler.providers.gcp.services.cloudresourcemanager.cloudresourcemanager_service import (
    CloudResourceManager,
)

cloudresourcemanager_client = CloudResourceManager(gcp_audit_info)
@@ -0,0 +1,41 @@
from pydantic import BaseModel

from prowler.lib.logger import logger
from prowler.providers.gcp.gcp_provider import generate_client


################## CloudResourceManager
class CloudResourceManager:
    def __init__(self, audit_info):
        self.service = "cloudresourcemanager"
        self.api_version = "v1"
        self.region = "global"
        self.project_id = audit_info.project_id
        self.client = generate_client(self.service, self.api_version, audit_info)
        self.bindings = []
        self.__get_iam_policy__()

    def __get_client__(self):
        return self.client

    def __get_iam_policy__(self):
        try:
            policy = (
                self.client.projects().getIamPolicy(resource=self.project_id).execute()
            )
            for binding in policy["bindings"]:
                self.bindings.append(
                    Binding(
                        role=binding["role"],
                        members=binding["members"],
                    )
                )
        except Exception as error:
            logger.error(
                f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )


class Binding(BaseModel):
    role: str
    members: list
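The service above keeps the project policy as a flat list of role-to-members bindings. IAM checks often want the inverse view, member to roles; a hedged sketch of that inversion (the `roles_by_member` helper is hypothetical, not part of prowler):

```python
def roles_by_member(bindings):
    # Invert the project policy's role -> members bindings into
    # member -> set of roles, a convenient shape for IAM checks that ask
    # "which roles does this principal hold?".
    out = {}
    for binding in bindings:
        for member in binding["members"]:
            out.setdefault(member, set()).add(binding["role"])
    return out


policy = [
    {"role": "roles/viewer", "members": ["user:a@example.com", "user:b@example.com"]},
    {"role": "roles/editor", "members": ["user:a@example.com"]},
]
print(roles_by_member(policy)["user:a@example.com"])
```

The same dict-of-sets shape works whether the input is raw API dicts (as here) or the `Binding` pydantic models the service stores.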
0
prowler/providers/gcp/services/cloudsql/__init__.py
Normal file
@@ -0,0 +1,4 @@
from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info
from prowler.providers.gcp.services.cloudsql.cloudsql_service import CloudSQL

cloudsql_client = CloudSQL(gcp_audit_info)
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_automated_backups",
  "CheckTitle": "Ensure That Cloud SQL Database Instances Are Configured With Automated Backups",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure That Cloud SQL Database Instances Are Configured With Automated Backups",
  "Risk": "Backups provide a way to restore a Cloud SQL instance to recover lost data or recover from a problem with that instance. Automated backups need to be set for any instance that contains data that should be protected from loss or damage. This recommendation is applicable to SQL Server, PostgreSQL, MySQL generation 1, and MySQL generation 2 instances.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch <INSTANCE_NAME> --backup-start-time <HH:MM>",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/enable-automated-backups.html",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "It is recommended to have all SQL database instances set to enable automated backups.",
      "Url": "https://cloud.google.com/sql/docs/mysql/backup-recovery/backups"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,23 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_automated_backups(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for instance in cloudsql_client.instances:
            report = Check_Report_GCP(self.metadata())
            report.project_id = cloudsql_client.project_id
            report.resource_id = instance.name
            report.resource_name = instance.name
            report.location = instance.region
            report.status = "PASS"
            report.status_extended = (
                f"Database Instance {instance.name} has automated backups configured"
            )
            if not instance.automated_backups:
                report.status = "FAIL"
                report.status_extended = f"Database Instance {instance.name} does not have automated backups configured"
            findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_mysql_local_infile_flag",
  "CheckTitle": "Ensure That the Local_infile Database Flag for a Cloud SQL MySQL Instance Is Set to Off",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure That the Local_infile Database Flag for a Cloud SQL MySQL Instance Is Set to Off",
  "Risk": "The local_infile flag controls the server-side LOCAL capability for LOAD DATA statements. Depending on the local_infile setting, the server refuses or permits local data loading by clients that have LOCAL enabled on the client side.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags local_infile=off",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-local-infile-flag.html",
      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_1#terraform"
    },
    "Recommendation": {
      "Text": "It is recommended to set the local_infile database flag for a Cloud SQL MySQL instance to off.",
      "Url": "https://cloud.google.com/sql/docs/mysql/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,24 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_mysql_local_infile_flag(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for instance in cloudsql_client.instances:
            if "MYSQL" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "FAIL"
                report.status_extended = f"MySQL Instance {instance.name} does not have 'local_infile' flag set to 'off'"
                for flag in instance.flags:
                    if flag["name"] == "local_infile" and flag["value"] == "off":
                        report.status = "PASS"
                        report.status_extended = f"MySQL Instance {instance.name} has 'local_infile' flag set to 'off'"
                        break
                findings.append(report)

        return findings
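All of the Cloud SQL flag checks in this changeset (local_infile, skip_show_database, the PostgreSQL log_* flags, and cloudsql.enable_pgaudit) share the same scan: default to FAIL, then flip to PASS if the named flag appears with the expected value. A hedged, standalone restatement of that shared pattern (`flag_is_set` is a hypothetical helper for illustration; the checks inline the loop):

```python
def flag_is_set(flags, name, expected):
    # Shared pattern across the Cloud SQL flag checks: PASS only when the
    # named database flag is present with the expected value. A missing
    # flag counts as FAIL, matching the checks' FAIL-by-default stance.
    return any(f["name"] == name and f["value"] == expected for f in flags)


flags = [{"name": "local_infile", "value": "off"}]
print(flag_is_set(flags, "local_infile", "off"))  # True
print(flag_is_set(flags, "skip_show_database", "on"))  # False
```

Defaulting to FAIL is deliberate: Cloud SQL only reports flags that were explicitly set, so an absent flag means the instance is running with the (non-compliant) default.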
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_mysql_skip_show_database_flag",
  "CheckTitle": "Ensure Skip_show_database Database Flag for Cloud SQL MySQL Instance Is Set to On",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure Skip_show_database Database Flag for Cloud SQL MySQL Instance Is Set to On",
  "Risk": "The 'skip_show_database' database flag prevents people from using the SHOW DATABASES statement if they do not have the SHOW DATABASES privilege.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags skip_show_database=on",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/enable-skip-show-database-flag.html",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "It is recommended to set the skip_show_database database flag for a Cloud SQL MySQL instance to on.",
      "Url": "https://cloud.google.com/sql/docs/mysql/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,24 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_mysql_skip_show_database_flag(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for instance in cloudsql_client.instances:
            if "MYSQL" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "FAIL"
                report.status_extended = f"MySQL Instance {instance.name} does not have 'skip_show_database' flag set to 'on'"
                for flag in instance.flags:
                    if flag["name"] == "skip_show_database" and flag["value"] == "on":
                        report.status = "PASS"
                        report.status_extended = f"MySQL Instance {instance.name} has 'skip_show_database' flag set to 'on'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_postgres_enable_pgaudit_flag",
  "CheckTitle": "Ensure That 'cloudsql.enable_pgaudit' Database Flag for each Cloud SQL PostgreSQL Instance Is Set to 'on' For Centralized Logging",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure That 'cloudsql.enable_pgaudit' Database Flag for each Cloud SQL PostgreSQL Instance Is Set to 'on' For Centralized Logging",
  "Risk": "Ensure the cloudsql.enable_pgaudit database flag for each Cloud SQL PostgreSQL instance is set to on to allow for centralized logging.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags cloudsql.enable_pgaudit=on",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/postgre-sql-audit-flag.html",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "As numerous other recommendations in this section consist of turning on flags for logging purposes, your organization will need a way to manage these logs. You may have a solution already in place. If you do not, consider installing and enabling the open source pgaudit extension within PostgreSQL and enabling its corresponding flag of cloudsql.enable_pgaudit. This flag and installing the extension enables database auditing in PostgreSQL through the open-source pgAudit extension. This extension provides detailed session and object logging to comply with government, financial, & ISO standards and provides auditing capabilities to mitigate threats by monitoring security events on the instance. Enabling the flag and settings later in this recommendation will send these logs to Google Logs Explorer so that you can access them in a central location.",
      "Url": "https://cloud.google.com/sql/docs/postgres/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,27 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_postgres_enable_pgaudit_flag(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for instance in cloudsql_client.instances:
            if "POSTGRES" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "FAIL"
                report.status_extended = f"PostgreSQL Instance {instance.name} does not have 'cloudsql.enable_pgaudit' flag set to 'on'"
                for flag in instance.flags:
                    if (
                        flag["name"] == "cloudsql.enable_pgaudit"
                        and flag["value"] == "on"
                    ):
                        report.status = "PASS"
                        report.status_extended = f"PostgreSQL Instance {instance.name} has 'cloudsql.enable_pgaudit' flag set to 'on'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_postgres_log_connections_flag",
  "CheckTitle": "Ensure That the Log_connections Database Flag for Cloud SQL PostgreSQL Instance Is Set to On",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure That the Log_connections Database Flag for Cloud SQL PostgreSQL Instance Is Set to On",
  "Risk": "Enabling the log_connections setting causes each attempted connection to the server to be logged, along with successful completion of client authentication. This parameter cannot be changed after the session starts.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_connections=on",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/enable-log-connections-flag.html",
      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_3#terraform"
    },
    "Recommendation": {
      "Text": "PostgreSQL does not log attempted connections by default. Enabling the log_connections setting will create log entries for each attempted connection, as well as successful completion of client authentication, which can be useful in troubleshooting issues and in determining any unusual connection attempts to the server.",
      "Url": "https://cloud.google.com/sql/docs/postgres/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,24 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_postgres_log_connections_flag(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for instance in cloudsql_client.instances:
            if "POSTGRES" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "FAIL"
                report.status_extended = f"PostgreSQL Instance {instance.name} does not have 'log_connections' flag set to 'on'"
                for flag in instance.flags:
                    if flag["name"] == "log_connections" and flag["value"] == "on":
                        report.status = "PASS"
                        report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_connections' flag set to 'on'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_postgres_log_disconnections_flag",
  "CheckTitle": "Ensure That the log_disconnections Database Flag for Cloud SQL PostgreSQL Instance Is Set to On",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure That the log_disconnections Database Flag for Cloud SQL PostgreSQL Instance Is Set to On",
  "Risk": "PostgreSQL does not log session details such as duration and session end by default. Enabling the log_disconnections setting will create log entries at the end of each session, which can be useful in troubleshooting issues and in determining any unusual activity across a time period.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_disconnections=on",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/enable-log-connections-flag.html",
      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_4#terraform"
    },
    "Recommendation": {
      "Text": "Enabling the log_disconnections setting logs the end of each session, including the session duration.",
      "Url": "https://cloud.google.com/sql/docs/postgres/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,24 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_postgres_log_disconnections_flag(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for instance in cloudsql_client.instances:
            if "POSTGRES" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "FAIL"
                report.status_extended = f"PostgreSQL Instance {instance.name} does not have 'log_disconnections' flag set to 'on'"
                for flag in instance.flags:
                    if flag["name"] == "log_disconnections" and flag["value"] == "on":
                        report.status = "PASS"
                        report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_disconnections' flag set to 'on'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_postgres_log_error_verbosity_flag",
  "CheckTitle": "Ensure Log_error_verbosity Database Flag for Cloud SQL PostgreSQL Instance Is Set to DEFAULT or Stricter",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure Log_error_verbosity Database Flag for Cloud SQL PostgreSQL Instance Is Set to DEFAULT or Stricter",
  "Risk": "The log_error_verbosity flag controls the verbosity/details of messages logged. TERSE excludes the logging of DETAIL, HINT, QUERY, and CONTEXT error information. VERBOSE output includes the SQLSTATE error code, source code file name, function name, and line number that generated the error. Ensure an appropriate value is set to 'DEFAULT' or stricter.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_error_verbosity=default",
      "NativeIaC": "",
      "Other": "",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "Auditing helps in troubleshooting operational problems and also permits forensic analysis. If log_error_verbosity is not set to the correct value, too many details or too few details may be logged. This flag should be configured with a value of 'DEFAULT' or stricter. This recommendation is applicable to PostgreSQL database instances.",
      "Url": "https://cloud.google.com/sql/docs/postgres/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,27 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_postgres_log_error_verbosity_flag(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for instance in cloudsql_client.instances:
            if "POSTGRES" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "FAIL"
                report.status_extended = f"PostgreSQL Instance {instance.name} does not have 'log_error_verbosity' flag set to 'default'"
                for flag in instance.flags:
                    if (
                        flag["name"] == "log_error_verbosity"
                        and flag["value"] == "default"
                    ):
                        report.status = "PASS"
                        report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_error_verbosity' flag set to 'default'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_postgres_log_min_duration_statement_flag",
  "CheckTitle": "Ensure that the Log_min_duration_statement Flag for a Cloud SQL PostgreSQL Instance Is Set to -1",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure that the Log_min_duration_statement Flag for a Cloud SQL PostgreSQL Instance Is Set to -1",
  "Risk": "The log_min_duration_statement flag defines the minimum amount of execution time of a statement in milliseconds where the total duration of the statement is logged. Ensure that log_min_duration_statement is disabled, i.e., a value of -1 is set.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_min_duration_statement=-1",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/configure-log-min-error-statement-flag.html",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "Logging SQL statements may include sensitive information that should not be recorded in logs. This recommendation is applicable to PostgreSQL database instances.",
      "Url": "https://cloud.google.com/sql/docs/postgres/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,27 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_postgres_log_min_duration_statement_flag(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for instance in cloudsql_client.instances:
            if "POSTGRES" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "FAIL"
                report.status_extended = f"PostgreSQL Instance {instance.name} does not have 'log_min_duration_statement' flag set to '-1'"
                for flag in instance.flags:
                    if (
                        flag["name"] == "log_min_duration_statement"
                        and flag["value"] == "-1"
                    ):
                        report.status = "PASS"
                        report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_min_duration_statement' flag set to '-1'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_postgres_log_min_error_statement_flag",
  "CheckTitle": "Ensure that the Log_min_error_statement Flag for a Cloud SQL PostgreSQL Instance Is Set Appropriately",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure that the Log_min_error_statement Flag for a Cloud SQL PostgreSQL Instance Is Set Appropriately",
  "Risk": "The log_min_error_statement flag defines the minimum message severity level that is considered an error statement. Messages for error statements are logged with the SQL statement. Valid values include DEBUG5, DEBUG4, DEBUG3, DEBUG2, DEBUG1, INFO, NOTICE, WARNING, ERROR, LOG, FATAL, and PANIC. Each severity level includes the subsequent levels mentioned above. Ensure a value of ERROR or stricter is set.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_min_error_statement=error",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/configure-log-min-error-statement-flag.html",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "Auditing helps in troubleshooting operational problems and also permits forensic analysis. If log_min_error_statement is not set to the correct value, messages may not be classified as error messages appropriately. Considering general log messages as error messages would make it difficult to find actual errors, and considering only stricter severity levels as error messages may skip actual errors to log their SQL statements. The log_min_error_statement flag should be set to ERROR or stricter.",
      "Url": "https://cloud.google.com/sql/docs/postgres/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,28 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_postgres_log_min_error_statement_flag(Check):
    def execute(self) -> Check_Report_GCP:
        desired_log_min_error_statement = "error"
        findings = []
        for instance in cloudsql_client.instances:
            if "POSTGRES" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "FAIL"
                report.status_extended = f"PostgreSQL Instance {instance.name} does not have 'log_min_error_statement' flag set to at least '{desired_log_min_error_statement}'"
                for flag in instance.flags:
                    if (
                        flag["name"] == "log_min_error_statement"
                        and flag["value"] == desired_log_min_error_statement
                    ):
                        report.status = "PASS"
                        report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_min_error_statement' flag set to at least '{desired_log_min_error_statement}'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_postgres_log_min_messages_flag",
  "CheckTitle": "Ensure that the Log_min_messages Flag for a Cloud SQL PostgreSQL Instance Is Set Appropriately",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure that the Log_min_messages Flag for a Cloud SQL PostgreSQL Instance Is Set Appropriately",
  "Risk": "Auditing helps in troubleshooting operational problems and also permits forensic analysis. If log_min_messages is not set to the correct value, messages may not be classified as error messages appropriately. An organization will need to decide its own threshold for the log_min_messages flag.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_min_messages=warning",
      "NativeIaC": "",
      "Other": "",
      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_4#terraform"
    },
    "Recommendation": {
      "Text": "The log_min_messages flag defines the minimum message severity level that is written to the server log. Valid values include DEBUG5, DEBUG4, DEBUG3, DEBUG2, DEBUG1, INFO, NOTICE, WARNING, ERROR, LOG, FATAL, and PANIC. Each severity level includes the subsequent levels mentioned above. ERROR is considered the best practice setting. Changes should only be made in accordance with the organization's logging policy.",
      "Url": "https://cloud.google.com/sql/docs/postgres/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,28 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_postgres_log_min_messages_flag(Check):
    def execute(self) -> Check_Report_GCP:
        desired_log_min_messages = "error"
        findings = []
        for instance in cloudsql_client.instances:
            if "POSTGRES" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "FAIL"
                report.status_extended = f"PostgreSQL Instance {instance.name} does not have 'log_min_messages' flag set to at least '{desired_log_min_messages}'"
                for flag in instance.flags:
                    if (
                        flag["name"] == "log_min_messages"
                        and flag["value"] == desired_log_min_messages
                    ):
                        report.status = "PASS"
                        report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_min_messages' flag set to at least '{desired_log_min_messages}'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_postgres_log_statement_flag",
  "CheckTitle": "Ensure That the Log_statement Database Flag for Cloud SQL PostgreSQL Instance Is Set Appropriately",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure That the Log_statement Database Flag for Cloud SQL PostgreSQL Instance Is Set Appropriately",
  "Risk": "Auditing helps in forensic analysis. If log_statement is not set to the correct value, too many statements may be logged, leading to issues in finding the relevant information from the logs, or too few statements may be logged, with relevant information missing from the logs.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_statement=ddl",
      "NativeIaC": "",
      "Other": "",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "The value 'ddl' logs all data definition statements. A value of 'ddl' is recommended unless otherwise directed by your organization's logging policy.",
      "Url": "https://cloud.google.com/sql/docs/postgres/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,28 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_postgres_log_statement_flag(Check):
    def execute(self) -> Check_Report_GCP:
        desired_log_statement = "ddl"
        findings = []
        for instance in cloudsql_client.instances:
            if "POSTGRES" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "FAIL"
                report.status_extended = f"PostgreSQL Instance {instance.name} does not have 'log_statement' flag set to '{desired_log_statement}'"
                for flag in instance.flags:
                    if (
                        flag["name"] == "log_statement"
                        and flag["value"] == desired_log_statement
                    ):
                        report.status = "PASS"
                        report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_statement' flag set to '{desired_log_statement}'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_private_ip_assignment",
  "CheckTitle": "Ensure Instance IP assignment is set to private",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure Instance IP assignment is set to private",
  "Risk": "Instance addresses can be public IP or private IP. Public IP means that the instance is accessible through the public internet. In contrast, instances using only private IP are not accessible through the public internet, but are accessible through a Virtual Private Cloud (VPC). Limiting network access to your database will limit potential attacks.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "",
      "NativeIaC": "",
      "Other": "",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "Restricting database access to private IP addresses reduces the attack surface.",
      "Url": "https://cloud.google.com/sql/docs/mysql/configure-private-ip"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,25 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_private_ip_assignment(Check):
    def execute(self) -> Check_Report_GCP:
        findings = []
        for instance in cloudsql_client.instances:
            report = Check_Report_GCP(self.metadata())
            report.project_id = cloudsql_client.project_id
            report.resource_id = instance.name
            report.resource_name = instance.name
            report.location = instance.region
            report.status = "PASS"
            report.status_extended = f"Database Instance {instance.name} does not have public IP assignments"
            for address in instance.ip_addresses:
                if address["type"] != "PRIVATE":
                    report.status = "FAIL"
                    report.status_extended = (
                        f"Database Instance {instance.name} has public IP assignments"
                    )
                    break
            findings.append(report)

        return findings
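The private-IP check above passes only when no attached address has a non-PRIVATE type. That predicate can be sketched in isolation; the helper and sample data are illustrative (Cloud SQL reports each address as a dict with a `type` field, with the public-facing address typed PRIMARY):

```python
def only_private_addresses(ip_addresses):
    """True when every address attached to the instance is of type PRIVATE."""
    return all(addr["type"] == "PRIVATE" for addr in ip_addresses)

private_only = [{"type": "PRIVATE", "ipAddress": "10.0.0.5"}]
mixed = [
    {"type": "PRIMARY", "ipAddress": "203.0.113.10"},  # public-facing address
    {"type": "PRIVATE", "ipAddress": "10.0.0.5"},
]
```

An instance with no addresses at all passes vacuously, matching the check's default-PASS behavior.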
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_public_access",
  "CheckTitle": "Ensure That Cloud SQL Database Instances Do Not Implicitly Whitelist All Public IP Addresses",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "high",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure That Cloud SQL Database Instances Do Not Implicitly Whitelist All Public IP Addresses",
  "Risk": "To minimize the attack surface on a Database server instance, only trusted/known and required IP(s) should be whitelisted to connect to it. An authorized network should not have IPs/networks configured to 0.0.0.0/0, which will allow access to the instance from anywhere in the world. Note that authorized networks apply only to instances with public IPs.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch <INSTANCE_NAME> --authorized-networks=IP_ADDR1,IP_ADDR2...",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/publicly-accessible-cloud-sql-instances.html",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "Database Server should accept connections only from trusted Network(s)/IP(s) and restrict access from public IP addresses.",
      "Url": "https://cloud.google.com/sql/docs/mysql/connection-org-policy"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,22 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_public_access(Check):
    def execute(self) -> Check_Report_GCP:
        findings = []
        for instance in cloudsql_client.instances:
            report = Check_Report_GCP(self.metadata())
            report.project_id = cloudsql_client.project_id
            report.resource_id = instance.name
            report.resource_name = instance.name
            report.location = instance.region
            report.status = "PASS"
            report.status_extended = f"Database Instance {instance.name} does not whitelist all Public IP Addresses"
            for network in instance.authorized_networks:
                if network["value"] == "0.0.0.0/0":
                    report.status = "FAIL"
                    report.status_extended = f"Database Instance {instance.name} whitelists all Public IP Addresses"
                    break
            findings.append(report)

        return findings
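The check above matches the literal string `0.0.0.0/0`. Other spellings (`0.0.0.0/0.0.0.0`, `::/0`) also cover the whole address space, so a prefix-length test with the standard-library `ipaddress` module is a more robust variant. This is a sketch of that generalization, not Prowler's implementation (which uses the string comparison shown above):

```python
import ipaddress

def allows_all_addresses(authorized_networks):
    """True if any authorized-network entry covers the entire address space."""
    for net in authorized_networks:
        try:
            # A prefix length of 0 means every address matches (IPv4 or IPv6)
            if ipaddress.ip_network(net["value"], strict=False).prefixlen == 0:
                return True
        except ValueError:
            continue  # ignore malformed entries rather than crash the check
    return False
```

The `try/except` keeps a single malformed entry from aborting the whole scan, a trade-off worth making in an auditing tool.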
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_public_ip",
  "CheckTitle": "Check for Cloud SQL Database Instances with Public IPs",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Check for Cloud SQL Database Instances with Public IPs",
  "Risk": "To lower the organization's attack surface, Cloud SQL databases should not have public IPs. Private IPs provide improved network security and lower latency for your application.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "https://docs.bridgecrew.io/docs/bc_gcp_sql_11#cli-command",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/sql-database-instances-with-public-ips.html",
      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_11#terraform"
    },
    "Recommendation": {
      "Text": "To lower the organization's attack surface, Cloud SQL databases should not have public IPs. Private IPs provide improved network security and lower latency for your application.",
      "Url": "https://cloud.google.com/sql/docs/mysql/configure-private-ip"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,25 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_public_ip(Check):
    def execute(self) -> Check_Report_GCP:
        findings = []
        for instance in cloudsql_client.instances:
            report = Check_Report_GCP(self.metadata())
            report.project_id = cloudsql_client.project_id
            report.resource_id = instance.name
            report.resource_name = instance.name
            report.location = instance.region
            report.status = "PASS"
            report.status_extended = (
                f"Database Instance {instance.name} does not have a public IP"
            )
            if instance.public_ip:
                report.status = "FAIL"
                report.status_extended = (
                    f"Database Instance {instance.name} has a public IP"
                )
            findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_sqlserver_contained_database_authentication_flag",
  "CheckTitle": "Ensure that the 'contained database authentication' database flag for Cloud SQL on the SQL Server instance is set to 'off'",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure that the 'contained database authentication' database flag for Cloud SQL on the SQL Server instance is set to 'off'",
  "Risk": "A contained database includes all database settings and metadata required to define the database and has no configuration dependencies on the instance of the Database Engine where the database is installed. Users can connect to the database without authenticating a login at the Database Engine level. Isolating the database from the Database Engine makes it possible to easily move the database to another instance of SQL Server. Contained databases have some unique threats that should be understood and mitigated by SQL Server Database Engine administrators. Most of the threats are related to the USER WITH PASSWORD authentication process, which moves the authentication boundary from the Database Engine level to the database level; hence it is recommended to disable this flag. This recommendation is applicable to SQL Server database instances.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch <INSTANCE_NAME> --database-flags \"contained database authentication=off\"",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-contained-database-authentication-flag.html",
      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_10#terraform"
    },
    "Recommendation": {
      "Text": "It is recommended to set the 'contained database authentication' database flag for Cloud SQL on the SQL Server instance to 'off'.",
      "Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,27 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_sqlserver_contained_database_authentication_flag(Check):
    def execute(self) -> Check_Report_GCP:
        findings = []
        for instance in cloudsql_client.instances:
            if "SQLSERVER" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "PASS"
                report.status_extended = f"SQL Server Instance {instance.name} has 'contained database authentication' flag set to 'off'"
                for flag in instance.flags:
                    if (
                        flag["name"] == "contained database authentication"
                        and flag["value"] == "on"
                    ):
                        report.status = "FAIL"
                        report.status_extended = f"SQL Server Instance {instance.name} has 'contained database authentication' flag set to 'on'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag",
  "CheckTitle": "Ensure that the 'cross db ownership chaining' database flag for Cloud SQL SQL Server instance is set to 'off'",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure that the 'cross db ownership chaining' database flag for Cloud SQL SQL Server instance is set to 'off'",
  "Risk": "Use the 'cross db ownership chaining' option to configure cross-database ownership chaining for an instance of Microsoft SQL Server. This server option allows you to control cross-database ownership chaining at the database level or to allow cross-database ownership chaining for all databases. Enabling 'cross db ownership chaining' is not recommended unless all of the databases hosted by the instance of SQL Server must participate in cross-database ownership chaining and you are aware of the security implications of this setting.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags \"cross db ownership chaining=off\"",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-cross-db-ownership-chaining-flag.html",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "It is recommended to set the 'cross db ownership chaining' database flag for Cloud SQL SQL Server instance to 'off'.",
      "Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,24 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag(Check):
    def execute(self) -> Check_Report_GCP:
        findings = []
        for instance in cloudsql_client.instances:
            if "SQLSERVER" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "PASS"
                report.status_extended = f"SQL Server Instance {instance.name} has 'cross db ownership chaining' flag set to 'off'"
                for flag in instance.flags:
                    if (
                        flag["name"] == "cross db ownership chaining"
                        and flag["value"] == "on"
                    ):
                        report.status = "FAIL"
                        report.status_extended = f"SQL Server Instance {instance.name} has 'cross db ownership chaining' flag set to 'on'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_sqlserver_external_scripts_enabled_flag",
  "CheckTitle": "Ensure 'external scripts enabled' database flag for Cloud SQL SQL Server instance is set to 'off'",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "high",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure 'external scripts enabled' database flag for Cloud SQL SQL Server instance is set to 'off'",
  "Risk": "'external scripts enabled' enables the execution of scripts with certain remote language extensions. This property is OFF by default. When Advanced Analytics Services is installed, setup can optionally set this property to true. Because the 'external scripts enabled' feature allows scripts external to SQL, such as files located in an R library, to be executed, which could adversely affect the security of the system, it should be disabled.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags \"external scripts enabled=off\"",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-external-scripts-enabled-flag.html",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "It is recommended to set the 'external scripts enabled' database flag for Cloud SQL SQL Server instance to 'off'.",
      "Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,27 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_sqlserver_external_scripts_enabled_flag(Check):
    def execute(self) -> Check_Report_GCP:
        findings = []
        for instance in cloudsql_client.instances:
            if "SQLSERVER" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "PASS"
                report.status_extended = f"SQL Server Instance {instance.name} has 'external scripts enabled' flag set to 'off'"
                for flag in instance.flags:
                    if (
                        flag["name"] == "external scripts enabled"
                        and flag["value"] == "on"
                    ):
                        report.status = "FAIL"
                        report.status_extended = f"SQL Server Instance {instance.name} has 'external scripts enabled' flag set to 'on'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_sqlserver_remote_access_flag",
  "CheckTitle": "Ensure 'remote access' database flag for Cloud SQL SQL Server instance is set to 'off'",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "high",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure 'remote access' database flag for Cloud SQL SQL Server instance is set to 'off'",
  "Risk": "The remote access option controls the execution of stored procedures from local or remote servers on which instances of SQL Server are running. The default value for this option is 1, which grants permission to run local stored procedures from remote servers or remote stored procedures from the local server. To prevent local stored procedures from being run from a remote server, or remote stored procedures from being run on the local server, this must be disabled. 'Remote access' functionality can be abused to launch a Denial-of-Service (DoS) attack on remote servers by off-loading query processing to a target; hence it should be disabled. This recommendation is applicable to SQL Server database instances.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-remote-access-flag.html",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "It is recommended to set the 'remote access' database flag for Cloud SQL SQL Server instance to 'off'.",
      "Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,24 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_sqlserver_remote_access_flag(Check):
    def execute(self) -> Check_Report_GCP:
        findings = []
        for instance in cloudsql_client.instances:
            if "SQLSERVER" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "PASS"
                report.status_extended = f"SQL Server Instance {instance.name} does not have 'remote access' flag set to 'on'"
                for flag in instance.flags:
                    if flag["name"] == "remote access" and flag["value"] == "on":
                        report.status = "FAIL"
                        report.status_extended = f"SQL Server Instance {instance.name} has 'remote access' flag set to 'on'"
                        break
                findings.append(report)

        return findings
@@ -0,0 +1,30 @@
{
  "Provider": "gcp",
  "CheckID": "cloudsql_instance_sqlserver_trace_flag",
  "CheckTitle": "Ensure '3625 (trace flag)' database flag for all Cloud SQL Server instances is set to 'on'",
  "CheckType": [],
  "ServiceName": "cloudsql",
  "SubServiceName": "",
  "ResourceIdTemplate": "",
  "Severity": "medium",
  "ResourceType": "DatabaseInstance",
  "Description": "Ensure '3625 (trace flag)' database flag for all Cloud SQL Server instances is set to 'on'",
  "Risk": "Microsoft SQL Trace Flags are frequently used to diagnose performance issues or to debug stored procedures or complex computer systems, but they may also be recommended by Microsoft Support to address behavior that is negatively impacting a specific workload. All documented trace flags and those recommended by Microsoft Support are fully supported in a production environment when used as directed. Trace flag 3625 limits the amount of information returned to users who are not members of the sysadmin fixed server role, by masking the parameters of some error messages using '******'. Setting this as a Google Cloud database flag for the instance provides security through obscurity and prevents the disclosure of sensitive information; hence it is recommended to set this flag globally to 'on' to prevent it from being left off or changed by bad actors. This recommendation is applicable to SQL Server database instances.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "gcloud sql instances patch <INSTANCE_NAME> --database-flags 3625=on",
      "NativeIaC": "",
      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-3625-trace-flag.html",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "It is recommended to set the '3625 (trace flag)' database flag for Cloud SQL SQL Server instances to 'on'.",
      "Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
    }
  },
  "Categories": [],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
@@ -0,0 +1,24 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_sqlserver_trace_flag(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for instance in cloudsql_client.instances:
            if "SQLSERVER" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "PASS"
                report.status_extended = f"SQL Server instance {instance.name} has '3625 (trace flag)' database flag set to 'on'."
                for flag in instance.flags:
                    if flag["name"] == "3625" and flag["value"] == "off":
                        report.status = "FAIL"
                        report.status_extended = f"SQL Server instance {instance.name} has '3625 (trace flag)' database flag set to 'off'."
                        break
                findings.append(report)

        return findings
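Read outside the diff, the pass/fail rule of this check can be sketched as a small standalone helper (the function name `evaluate_trace_flag_3625` is illustrative, not part of Prowler): the check defaults to PASS and only fails when the `3625` flag is explicitly set to `off`.

```python
def evaluate_trace_flag_3625(flags):
    """Mirror of the check's rule: FAIL only when database flag '3625'
    is explicitly set to 'off'; an absent flag or 'on' passes."""
    for flag in flags:
        if flag["name"] == "3625" and flag["value"] == "off":
            return "FAIL"
    return "PASS"
```

Note that, under this rule, an instance with no flags at all also passes; the check treats only an explicit `off` as a finding.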
@@ -0,0 +1,30 @@
{
"Provider": "gcp",
"CheckID": "cloudsql_instance_sqlserver_user_connections_flag",
"CheckTitle": "Ensure 'user connections' database flag for Cloud SQL SQL Server instance is set to a non-limiting value",
"CheckType": [],
"ServiceName": "cloudsql",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "DatabaseInstance",
"Description": "Ensure 'user connections' database flag for Cloud SQL SQL Server instance is set to a non-limiting value",
"Risk": "The 'user connections' option specifies the maximum number of simultaneous user connections allowed on an instance of SQL Server. The actual number of user connections allowed also depends on the version of SQL Server you are using, as well as the limits of your applications and hardware. SQL Server allows a maximum of 32,767 user connections. Because 'user connections' is by default a self-configuring value, SQL Server adjusts the maximum number of user connections automatically as needed, up to the maximum allowable value. For example, if only 10 users are logged in, 10 user connection objects are allocated. In most cases, you do not have to change the value for this option. The default is 0, which means that the maximum (32,767) user connections are allowed. However, if a value is defined here that limits connections, SQL Server will not allow any connections above this limit. Once connections are at the limit, any new requests will be dropped, potentially causing lost data or outages for those using the database.",
"RelatedUrl": "",
"Remediation": {
"Code": {
"CLI": "gcloud sql instances patch <INSTANCE_NAME> --database-flags=\"user connections=0\"",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/configure-user-connection-flag.html",
"Terraform": ""
},
"Recommendation": {
"Text": "It is recommended to check the 'user connections' flag for a Cloud SQL SQL Server instance to ensure that it is not artificially limiting connections.",
"Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
}
@@ -0,0 +1,24 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_sqlserver_user_connections_flag(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for instance in cloudsql_client.instances:
            if "SQLSERVER" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "PASS"
                report.status_extended = f"SQL Server instance {instance.name} has 'user connections' database flag set to '0'."
                for flag in instance.flags:
                    # The flag is non-limiting only when set to "0" (or absent);
                    # any other value caps the number of connections, so FAIL.
                    if flag["name"] == "user connections" and flag["value"] != "0":
                        report.status = "FAIL"
                        report.status_extended = f"SQL Server instance {instance.name} does not have 'user connections' database flag set to '0'."
                        break
                findings.append(report)

        return findings
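Per the metadata above, `user connections=0` means the SQL Server maximum of 32,767 connections is allowed, so the intended rule is: FAIL only when the flag is set to a value other than `0`. A minimal standalone sketch of that rule (the helper name `evaluate_user_connections` is illustrative, not part of Prowler):

```python
def evaluate_user_connections(flags):
    """FAIL when 'user connections' is set to anything other than '0';
    '0' (the non-limiting default) or an absent flag passes."""
    for flag in flags:
        if flag["name"] == "user connections" and flag["value"] != "0":
            return "FAIL"
    return "PASS"
```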
@@ -0,0 +1,30 @@
{
"Provider": "gcp",
"CheckID": "cloudsql_instance_sqlserver_user_options_flag",
"CheckTitle": "Ensure 'user options' database flag for Cloud SQL SQL Server instance is not configured",
"CheckType": [],
"ServiceName": "cloudsql",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "DatabaseInstance",
"Description": "Ensure 'user options' database flag for Cloud SQL SQL Server instance is not configured",
"Risk": "The 'user options' option specifies global defaults for all users. A list of default query-processing options is established for the duration of a user's work session. The 'user options' option allows you to change the default values of the SET options (if the server's default settings are not appropriate). A user can override these defaults by using the SET statement. You can configure 'user options' dynamically for new logins. After you change the setting, new login sessions use the new setting; current login sessions are not affected.",
"RelatedUrl": "",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/user-options-flag-not-configured.html",
"Terraform": ""
},
"Recommendation": {
"Text": "It is recommended that the 'user options' database flag for Cloud SQL SQL Server instances not be configured.",
"Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
}
@@ -0,0 +1,24 @@
from prowler.lib.check.models import Check, Check_Report_GCP
from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client


class cloudsql_instance_sqlserver_user_options_flag(Check):
    def execute(self) -> list[Check_Report_GCP]:
        findings = []
        for instance in cloudsql_client.instances:
            if "SQLSERVER" in instance.version:
                report = Check_Report_GCP(self.metadata())
                report.project_id = cloudsql_client.project_id
                report.resource_id = instance.name
                report.resource_name = instance.name
                report.location = instance.region
                report.status = "PASS"
                report.status_extended = f"SQL Server instance {instance.name} does not have 'user options' database flag set."
                for flag in instance.flags:
                    if flag["name"] == "user options" and flag["value"] != "":
                        report.status = "FAIL"
                        report.status_extended = f"SQL Server instance {instance.name} has 'user options' database flag set."
                        break
                findings.append(report)

        return findings
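This third check inverts the previous two: here any configured value is a finding, and only an absent (or empty) flag passes. A minimal standalone sketch of that rule (the helper name `evaluate_user_options` is illustrative, not part of Prowler):

```python
def evaluate_user_options(flags):
    """FAIL when the 'user options' flag is configured with any value;
    an absent or empty flag passes."""
    for flag in flags:
        if flag["name"] == "user options" and flag["value"] != "":
            return "FAIL"
    return "PASS"
```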