diff --git a/README.md b/README.md index 267b72ec..575b559b 100644 --- a/README.md +++ b/README.md @@ -41,7 +41,7 @@ It contains hundreds of controls covering CIS, PCI-DSS, ISO27001, GDPR, HIPAA, F # 📖 Documentation The full documentation can now be found at [https://docs.prowler.cloud](https://docs.prowler.cloud) - + ## Looking for Prowler v2 documentation? For Prowler v2 Documentation, please go to https://github.com/prowler-cloud/prowler/tree/2.12.1. @@ -54,7 +54,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo pip install prowler prowler -v ``` -More details at https://docs.prowler.cloud +More details at https://docs.prowler.cloud ## Containers @@ -63,7 +63,7 @@ The available versions of Prowler are the following: - `latest`: in sync with master branch (bear in mind that it is not a stable version) - `` (release): you can find the releases [here](https://github.com/prowler-cloud/prowler/releases), those are stable releases. - `stable`: this tag always point to the latest release. - + The container images are available here: - [DockerHub](https://hub.docker.com/r/toniblyx/prowler/tags) @@ -116,6 +116,22 @@ Those credentials must be associated to a user or role with proper permissions t > If you want Prowler to send findings to [AWS Security Hub](https://aws.amazon.com/security-hub), make sure you also attach the custom policy [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json). +## Google Cloud Platform + +Prowler follows the same credentials search order as the [Google authentication libraries](https://cloud.google.com/docs/authentication/application-default-credentials#search_order): + +1. [GOOGLE_APPLICATION_CREDENTIALS environment variable](https://cloud.google.com/docs/authentication/application-default-credentials#GAC) +2. [User credentials set up by using the Google Cloud CLI](https://cloud.google.com/docs/authentication/application-default-credentials#personal) +3. [The attached service account, returned by the metadata server](https://cloud.google.com/docs/authentication/application-default-credentials#attached-sa) + +Those credentials must be associated with a user or service account that has the permissions required to run all checks. To ensure this, add the following roles to the member associated with the credentials: + + - Viewer + - Security Reviewer + - Stackdriver Account Viewer + +> `prowler` will scan the project associated with the credentials. + ## Azure Prowler for Azure supports the following authentication types: @@ -229,6 +245,14 @@ prowler aws --profile custom-profile -f us-east-1 eu-south-2 ``` > By default, `prowler` will scan all AWS regions. +## Google Cloud Platform + +Optionally, you can provide the location of an application credentials JSON file with the following argument: + +```console +prowler gcp --credentials-file path +``` + ## Azure With Azure you need to specify which auth method is going to be used: diff --git a/docs/getting-started/requirements.md b/docs/getting-started/requirements.md index 1da5ef13..610989bd 100644 --- a/docs/getting-started/requirements.md +++ b/docs/getting-started/requirements.md @@ -30,6 +30,24 @@ Those credentials must be associated to a user or role with proper permissions t > If you want Prowler to send findings to [AWS Security Hub](https://aws.amazon.com/security-hub), make sure you also attach the custom policy [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json).
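The Google Cloud Platform credential search described above boils down to a single call in the `google-auth` library. The snippet below is an illustrative sketch only, not part of this diff; it assumes `google-auth` is installed and that Application Default Credentials are available:

```python
# Sketch of Application Default Credentials (ADC) resolution, which follows
# the documented search order:
#   1. the GOOGLE_APPLICATION_CREDENTIALS environment variable,
#   2. user credentials created with the Google Cloud CLI,
#   3. the attached service account returned by the metadata server.
from google import auth

# Returns the resolved credentials plus the project they are bound to;
# that project is what `prowler gcp` scans by default.
credentials, project_id = auth.default()
print(f"Resolved credentials for project: {project_id}")
```

Whatever credentials `google.auth.default()` resolves determine the project that `prowler gcp` scans, which is why no explicit project flag is required.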
+## Google Cloud + +### GCP Authentication + +Prowler follows the same credentials search order as the [Google authentication libraries](https://cloud.google.com/docs/authentication/application-default-credentials#search_order): + +1. [GOOGLE_APPLICATION_CREDENTIALS environment variable](https://cloud.google.com/docs/authentication/application-default-credentials#GAC) +2. [User credentials set up by using the Google Cloud CLI](https://cloud.google.com/docs/authentication/application-default-credentials#personal) +3. [The attached service account, returned by the metadata server](https://cloud.google.com/docs/authentication/application-default-credentials#attached-sa) + +Those credentials must be associated with a user or service account that has the permissions required to run all checks. To ensure this, add the following roles to the member associated with the credentials: + + - Viewer + - Security Reviewer + - Stackdriver Account Viewer + +> `prowler` will scan the project associated with the credentials. + ## Azure Prowler for azure supports the following authentication types: diff --git a/docs/index.md b/docs/index.md index 6625ed60..75a127bd 100644 --- a/docs/index.md +++ b/docs/index.md @@ -16,7 +16,7 @@ For **Prowler v2 Documentation**, please go [here](https://github.com/prowler-cl ## About Prowler -**Prowler** is an Open Source security tool to perform AWS and Azure security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. +**Prowler** is an Open Source security tool to perform AWS, Azure and Google Cloud security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. It contains hundreds of controls covering CIS, PCI-DSS, ISO27001, GDPR, HIPAA, FFIEC, SOC2, AWS FTR, ENS and custom security frameworks. @@ -40,7 +40,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo * `Python >= 3.9` * `Python pip >= 3.9` - * AWS and/or Azure credentials + * AWS, GCP and/or Azure credentials _Commands_: @@ -54,7 +54,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo _Requirements_: * Have `docker` installed: https://docs.docker.com/get-docker/. - * AWS and/or Azure credentials + * AWS, GCP and/or Azure credentials * In the command below, change `-v` to your local directory path in order to access the reports. _Commands_: @@ -71,7 +71,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo _Requirements for Ubuntu 20.04.3 LTS_: - * AWS and/or Azure credentials + * AWS, GCP and/or Azure credentials * Install python 3.9 with: `sudo apt-get install python3.9` * Remove python 3.8 to avoid conflicts if you can: `sudo apt-get remove python3.8` * Make sure you have the python3 distutils package installed: `sudo apt-get install python3-distutils` @@ -91,7 +91,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo _Requirements for Developers_: - * AWS and/or Azure credentials + * AWS, GCP and/or Azure credentials * `git`, `Python >= 3.9`, `pip` and `poetry` installed (`pip install poetry`) _Commands_: @@ -108,7 +108,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo _Requirements_: - * AWS and/or Azure credentials + * AWS, GCP and/or Azure credentials * Latest Amazon Linux 2 should come with Python 3.9 already installed however it may need pip. Install Python pip 3.9 with: `sudo dnf install -y python3-pip`.
* Make sure setuptools for python is already installed with: `pip3 install setuptools` @@ -125,7 +125,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo _Requirements_: * `Brew` installed in your Mac or Linux - * AWS and/or Azure credentials + * AWS, GCP and/or Azure credentials _Commands_: @@ -194,7 +194,7 @@ You can run Prowler from your workstation, an EC2 instance, Fargate or any other ![Architecture](img/architecture.png) ## Basic Usage -To run Prowler, you will need to specify the provider (e.g aws or azure): +To run Prowler, you will need to specify the provider (e.g. aws, gcp or azure): > If no provider specified, AWS will be used for backward compatibility with most of v2 options. ```console @@ -226,6 +226,7 @@ For executing specific checks or services you can use options `-c`/`checks` or ` ```console prowler azure --checks storage_blob_public_access_level_is_disabled prowler aws --services s3 ec2 +prowler gcp --services iam compute ``` Also, checks and services can be excluded with options `-e`/`--excluded-checks` or `--excluded-services`: @@ -233,6 +234,7 @@ Also, checks and services can be excluded with options `-e`/`--excluded-checks` ```console prowler aws --excluded-checks s3_bucket_public_access prowler azure --excluded-services defender iam +prowler gcp --excluded-services kms ``` More options and executions methods that will save your time in [Miscelaneous](tutorials/misc.md). @@ -252,6 +254,14 @@ prowler aws --profile custom-profile -f us-east-1 eu-south-2 ``` > By default, `prowler` will scan all AWS regions. +### Google Cloud + +Optionally, you can provide the location of an application credentials JSON file with the following argument: + +```console +prowler gcp --credentials-file path +``` + ### Azure With Azure you need to specify which auth method is going to be used: diff --git a/docs/tutorials/aws/securityhub.md b/docs/tutorials/aws/securityhub.md index d0e1be60..fdb1919d 100644 --- a/docs/tutorials/aws/securityhub.md +++ b/docs/tutorials/aws/securityhub.md @@ -13,7 +13,7 @@ Before sending findings to Prowler, you will need to perform next steps: - Using the AWS Management Console: ![Screenshot 2020-10-29 at 10 26 02 PM](https://user-images.githubusercontent.com/3985464/97634660-5ade3400-1a36-11eb-9a92-4a45cc98c158.png) 3.
Allow Prowler to import its findings to AWS Security Hub by adding the policy below to the role or user running Prowler: - - [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/iam/prowler-security-hub.json) + - [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json) Once it is enabled, it is as simple as running the command below (for all regions): diff --git a/poetry.lock b/poetry.lock index f85ed809..73f09b22 100644 --- a/poetry.lock +++ b/poetry.lock @@ -334,6 +334,18 @@ urllib3 = ">=1.25.4,<1.27" [package.extras] crt = ["awscrt (==0.16.9)"] +[[package]] +name = "cachetools" +version = "5.3.0" +description = "Extensible memoizing collections and decorators" +category = "main" +optional = false +python-versions = "~=3.7" +files = [ + {file = "cachetools-5.3.0-py3-none-any.whl", hash = "sha256:429e1a1e845c008ea6c85aa35d4b98b65d6a9763eeef3e37e92728a12d1de9d4"}, + {file = "cachetools-5.3.0.tar.gz", hash = "sha256:13dfddc7b8df938c21a940dfa6557ce6e94a2f1cdfa58eb90c805721d58f2c14"}, +] + [[package]] name = "certifi" version = "2022.12.7" @@ -862,6 +874,108 @@ files = [ [package.dependencies] gitdb = ">=4.0.1,<5" +[[package]] +name = "google-api-core" +version = "2.11.0" +description = "Google API client core library" +category = "main" +optional = false +python-versions = ">=3.7" +files = [ + {file = "google-api-core-2.11.0.tar.gz", hash = "sha256:4b9bb5d5a380a0befa0573b302651b8a9a89262c1730e37bf423cec511804c22"}, + {file = "google_api_core-2.11.0-py3-none-any.whl", hash = "sha256:ce222e27b0de0d7bc63eb043b956996d6dccab14cc3b690aaea91c9cc99dc16e"}, +] + +[package.dependencies] +google-auth = ">=2.14.1,<3.0dev" +googleapis-common-protos = ">=1.56.2,<2.0dev" +protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<5.0.0dev" +requests = ">=2.18.0,<3.0.0dev" + +[package.extras] +grpc = ["grpcio (>=1.33.2,<2.0dev)", "grpcio (>=1.49.1,<2.0dev)", "grpcio-status (>=1.33.2,<2.0dev)", "grpcio-status (>=1.49.1,<2.0dev)"] +grpcgcp = ["grpcio-gcp (>=0.2.2,<1.0dev)"] +grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0dev)"] + +[[package]] +name = "google-api-python-client" +version = "2.81.0" +description = "Google API Client Library for Python" +category = "main" +optional = false +python-versions = ">=3.7" +files = [ + {file = "google-api-python-client-2.81.0.tar.gz", hash = "sha256:8faab0b9b19d3797b455d33320c643253b6761fd0d3f3adb54792ab155d0795a"}, + {file = "google_api_python_client-2.81.0-py2.py3-none-any.whl", hash = "sha256:ad6700ae3a76ead8956d7f30935978cea308530e342ad8c1e26a4e40fc05c054"}, +] + +[package.dependencies] +google-api-core = ">=1.31.5,<2.0.0 || >2.3.0,<3.0.0dev" +google-auth = ">=1.19.0,<3.0.0dev" +google-auth-httplib2 = ">=0.1.0" +httplib2 = ">=0.15.0,<1dev" +uritemplate = ">=3.0.1,<5" + +[[package]] +name = "google-auth" +version = "2.16.2" +description = "Google Authentication Library" +category = "main" +optional = false +python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*" +files = [ + {file = "google-auth-2.16.2.tar.gz", hash = "sha256:07e14f34ec288e3f33e00e2e3cc40c8942aa5d4ceac06256a28cd8e786591420"}, + {file = "google_auth-2.16.2-py2.py3-none-any.whl", hash = "sha256:2fef3cf94876d1a0e204afece58bb4d83fb57228aaa366c64045039fda6770a2"}, +] + +[package.dependencies] +cachetools = ">=2.0.0,<6.0" +pyasn1-modules = ">=0.2.1" +rsa = {version = ">=3.1.4,<5", markers = 
"python_version >= \"3.6\""} +six = ">=1.9.0" + +[package.extras] +aiohttp = ["aiohttp (>=3.6.2,<4.0.0dev)", "requests (>=2.20.0,<3.0.0dev)"] +enterprise-cert = ["cryptography (==36.0.2)", "pyopenssl (==22.0.0)"] +pyopenssl = ["cryptography (>=38.0.3)", "pyopenssl (>=20.0.0)"] +reauth = ["pyu2f (>=0.1.5)"] +requests = ["requests (>=2.20.0,<3.0.0dev)"] + +[[package]] +name = "google-auth-httplib2" +version = "0.1.0" +description = "Google Authentication Library: httplib2 transport" +category = "main" +optional = false +python-versions = "*" +files = [ + {file = "google-auth-httplib2-0.1.0.tar.gz", hash = "sha256:a07c39fd632becacd3f07718dfd6021bf396978f03ad3ce4321d060015cc30ac"}, + {file = "google_auth_httplib2-0.1.0-py2.py3-none-any.whl", hash = "sha256:31e49c36c6b5643b57e82617cb3e021e3e1d2df9da63af67252c02fa9c1f4a10"}, +] + +[package.dependencies] +google-auth = "*" +httplib2 = ">=0.15.0" +six = "*" + +[[package]] +name = "googleapis-common-protos" +version = "1.59.0" +description = "Common protobufs used in Google APIs" +category = "main" +optional = false +python-versions = ">=3.7" +files = [ + {file = "googleapis-common-protos-1.59.0.tar.gz", hash = "sha256:4168fcb568a826a52f23510412da405abd93f4d23ba544bb68d943b14ba3cb44"}, + {file = "googleapis_common_protos-1.59.0-py2.py3-none-any.whl", hash = "sha256:b287dc48449d1d41af0c69f4ea26242b5ae4c3d7249a38b0984c86a4caffff1f"}, +] + +[package.dependencies] +protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<5.0.0dev" + +[package.extras] +grpc = ["grpcio (>=1.44.0,<2.0.0dev)"] + [[package]] name = "grapheme" version = "0.6.0" @@ -876,6 +990,21 @@ files = [ [package.extras] test = ["pytest", "sphinx", "sphinx-autobuild", "twine", "wheel"] +[[package]] +name = "httplib2" +version = "0.22.0" +description = "A comprehensive HTTP client library." 
+category = "main" +optional = false +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" +files = [ + {file = "httplib2-0.22.0-py3-none-any.whl", hash = "sha256:14ae0a53c1ba8f3d37e9e27cf37eabb0fb9980f435ba405d546948b009dd64dc"}, + {file = "httplib2-0.22.0.tar.gz", hash = "sha256:d7a10bc5ef5ab08322488bde8c726eeee5c8618723fdb399597ec58f3d82df81"}, +] + +[package.dependencies] +pyparsing = {version = ">=2.4.2,<3.0.0 || >3.0.0,<3.0.1 || >3.0.1,<3.0.2 || >3.0.2,<3.0.3 || >3.0.3,<4", markers = "python_version > \"3.0\""} + [[package]] name = "idna" version = "3.4" @@ -1580,6 +1709,56 @@ docs = ["sphinx (>=1.7.1)"] redis = ["redis"] tests = ["pytest (>=5.4.1)", "pytest-cov (>=2.8.1)", "pytest-mypy (>=0.8.0)", "pytest-timeout (>=2.1.0)", "redis", "sphinx (>=6.0.0)"] +[[package]] +name = "protobuf" +version = "4.22.1" +description = "" +category = "main" +optional = false +python-versions = ">=3.7" +files = [ + {file = "protobuf-4.22.1-cp310-abi3-win32.whl", hash = "sha256:85aa9acc5a777adc0c21b449dafbc40d9a0b6413ff3a4f77ef9df194be7f975b"}, + {file = "protobuf-4.22.1-cp310-abi3-win_amd64.whl", hash = "sha256:8bc971d76c03f1dd49f18115b002254f2ddb2d4b143c583bb860b796bb0d399e"}, + {file = "protobuf-4.22.1-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:5917412347e1da08ce2939eb5cd60650dfb1a9ab4606a415b9278a1041fb4d19"}, + {file = "protobuf-4.22.1-cp37-abi3-manylinux2014_aarch64.whl", hash = "sha256:9e12e2810e7d297dbce3c129ae5e912ffd94240b050d33f9ecf023f35563b14f"}, + {file = "protobuf-4.22.1-cp37-abi3-manylinux2014_x86_64.whl", hash = "sha256:953fc7904ef46900262a26374b28c2864610b60cdc8b272f864e22143f8373c4"}, + {file = "protobuf-4.22.1-cp37-cp37m-win32.whl", hash = "sha256:6e100f7bc787cd0a0ae58dbf0ab8bbf1ee7953f862b89148b6cf5436d5e9eaa1"}, + {file = "protobuf-4.22.1-cp37-cp37m-win_amd64.whl", hash = "sha256:87a6393fa634f294bf24d1cfe9fdd6bb605cbc247af81b9b10c4c0f12dfce4b3"}, + {file = "protobuf-4.22.1-cp38-cp38-win32.whl", hash = "sha256:e3fb58076bdb550e75db06ace2a8b3879d4c4f7ec9dd86e4254656118f4a78d7"}, + {file = "protobuf-4.22.1-cp38-cp38-win_amd64.whl", hash = "sha256:651113695bc2e5678b799ee5d906b5d3613f4ccfa61b12252cfceb6404558af0"}, + {file = "protobuf-4.22.1-cp39-cp39-win32.whl", hash = "sha256:67b7d19da0fda2733702c2299fd1ef6cb4b3d99f09263eacaf1aa151d9d05f02"}, + {file = "protobuf-4.22.1-cp39-cp39-win_amd64.whl", hash = "sha256:b8700792f88e59ccecfa246fa48f689d6eee6900eddd486cdae908ff706c482b"}, + {file = "protobuf-4.22.1-py3-none-any.whl", hash = "sha256:3e19dcf4adbf608924d3486ece469dd4f4f2cf7d2649900f0efcd1a84e8fd3ba"}, + {file = "protobuf-4.22.1.tar.gz", hash = "sha256:dce7a55d501c31ecf688adb2f6c3f763cf11bc0be815d1946a84d74772ab07a7"}, +] + +[[package]] +name = "pyasn1" +version = "0.4.8" +description = "ASN.1 types and codecs" +category = "main" +optional = false +python-versions = "*" +files = [ + {file = "pyasn1-0.4.8-py2.py3-none-any.whl", hash = "sha256:39c7e2ec30515947ff4e87fb6f456dfc6e84857d34be479c9d4a4ba4bf46aa5d"}, + {file = "pyasn1-0.4.8.tar.gz", hash = "sha256:aef77c9fb94a3ac588e87841208bdec464471d9871bd5050a287cc9a475cd0ba"}, +] + +[[package]] +name = "pyasn1-modules" +version = "0.2.8" +description = "A collection of ASN.1-based protocols modules." 
+category = "main" +optional = false +python-versions = "*" +files = [ + {file = "pyasn1-modules-0.2.8.tar.gz", hash = "sha256:905f84c712230b2c592c19470d3ca8d552de726050d1d1716282a1f6146be65e"}, + {file = "pyasn1_modules-0.2.8-py2.py3-none-any.whl", hash = "sha256:a50b808ffeb97cb3601dd25981f6b016cbb3d31fbf57a8b8a87428e6158d0c74"}, +] + +[package.dependencies] +pyasn1 = ">=0.4.6,<0.5.0" + [[package]] name = "pycodestyle" version = "2.10.0" @@ -1954,100 +2133,72 @@ pyyaml = "*" [[package]] name = "regex" -version = "2022.10.31" +version = "2023.3.22" description = "Alternative regular expression module, to replace re." category = "main" optional = true -python-versions = ">=3.6" +python-versions = ">=3.8" files = [ - {file = "regex-2022.10.31-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:a8ff454ef0bb061e37df03557afda9d785c905dab15584860f982e88be73015f"}, - {file = "regex-2022.10.31-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1eba476b1b242620c266edf6325b443a2e22b633217a9835a52d8da2b5c051f9"}, - {file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d0e5af9a9effb88535a472e19169e09ce750c3d442fb222254a276d77808620b"}, - {file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d03fe67b2325cb3f09be029fd5da8df9e6974f0cde2c2ac6a79d2634e791dd57"}, - {file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a9d0b68ac1743964755ae2d89772c7e6fb0118acd4d0b7464eaf3921c6b49dd4"}, - {file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8a45b6514861916c429e6059a55cf7db74670eaed2052a648e3e4d04f070e001"}, - {file = "regex-2022.10.31-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8b0886885f7323beea6f552c28bff62cbe0983b9fbb94126531693ea6c5ebb90"}, - {file = "regex-2022.10.31-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:5aefb84a301327ad115e9d346c8e2760009131d9d4b4c6b213648d02e2abe144"}, - {file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:702d8fc6f25bbf412ee706bd73019da5e44a8400861dfff7ff31eb5b4a1276dc"}, - {file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:a3c1ebd4ed8e76e886507c9eddb1a891673686c813adf889b864a17fafcf6d66"}, - {file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:50921c140561d3db2ab9f5b11c5184846cde686bb5a9dc64cae442926e86f3af"}, - {file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:7db345956ecce0c99b97b042b4ca7326feeec6b75facd8390af73b18e2650ffc"}, - {file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:763b64853b0a8f4f9cfb41a76a4a85a9bcda7fdda5cb057016e7706fde928e66"}, - {file = "regex-2022.10.31-cp310-cp310-win32.whl", hash = "sha256:44136355e2f5e06bf6b23d337a75386371ba742ffa771440b85bed367c1318d1"}, - {file = "regex-2022.10.31-cp310-cp310-win_amd64.whl", hash = "sha256:bfff48c7bd23c6e2aec6454aaf6edc44444b229e94743b34bdcdda2e35126cf5"}, - {file = "regex-2022.10.31-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4b4b1fe58cd102d75ef0552cf17242705ce0759f9695334a56644ad2d83903fe"}, - {file = "regex-2022.10.31-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:542e3e306d1669b25936b64917285cdffcd4f5c6f0247636fec037187bd93542"}, - {file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:c27cc1e4b197092e50ddbf0118c788d9977f3f8f35bfbbd3e76c1846a3443df7"}, - {file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b8e38472739028e5f2c3a4aded0ab7eadc447f0d84f310c7a8bb697ec417229e"}, - {file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:76c598ca73ec73a2f568e2a72ba46c3b6c8690ad9a07092b18e48ceb936e9f0c"}, - {file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c28d3309ebd6d6b2cf82969b5179bed5fefe6142c70f354ece94324fa11bf6a1"}, - {file = "regex-2022.10.31-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9af69f6746120998cd9c355e9c3c6aec7dff70d47247188feb4f829502be8ab4"}, - {file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a5f9505efd574d1e5b4a76ac9dd92a12acb2b309551e9aa874c13c11caefbe4f"}, - {file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:5ff525698de226c0ca743bfa71fc6b378cda2ddcf0d22d7c37b1cc925c9650a5"}, - {file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:4fe7fda2fe7c8890d454f2cbc91d6c01baf206fbc96d89a80241a02985118c0c"}, - {file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:2cdc55ca07b4e70dda898d2ab7150ecf17c990076d3acd7a5f3b25cb23a69f1c"}, - {file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:44a6c2f6374e0033873e9ed577a54a3602b4f609867794c1a3ebba65e4c93ee7"}, - {file = "regex-2022.10.31-cp311-cp311-win32.whl", hash = "sha256:d8716f82502997b3d0895d1c64c3b834181b1eaca28f3f6336a71777e437c2af"}, - {file = "regex-2022.10.31-cp311-cp311-win_amd64.whl", hash = "sha256:61edbca89aa3f5ef7ecac8c23d975fe7261c12665f1d90a6b1af527bba86ce61"}, - {file = "regex-2022.10.31-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:0a069c8483466806ab94ea9068c34b200b8bfc66b6762f45a831c4baaa9e8cdd"}, - {file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d26166acf62f731f50bdd885b04b38828436d74e8e362bfcb8df221d868b5d9b"}, - {file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ac741bf78b9bb432e2d314439275235f41656e189856b11fb4e774d9f7246d81"}, - {file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:75f591b2055523fc02a4bbe598aa867df9e953255f0b7f7715d2a36a9c30065c"}, - {file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b30bddd61d2a3261f025ad0f9ee2586988c6a00c780a2fb0a92cea2aa702c54"}, - {file = "regex-2022.10.31-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ef4163770525257876f10e8ece1cf25b71468316f61451ded1a6f44273eedeb5"}, - {file = "regex-2022.10.31-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:7b280948d00bd3973c1998f92e22aa3ecb76682e3a4255f33e1020bd32adf443"}, - {file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:d0213671691e341f6849bf33cd9fad21f7b1cb88b89e024f33370733fec58742"}, - {file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:22e7ebc231d28393dfdc19b185d97e14a0f178bedd78e85aad660e93b646604e"}, - {file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:8ad241da7fac963d7573cc67a064c57c58766b62a9a20c452ca1f21050868dfa"}, - {file = 
"regex-2022.10.31-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:586b36ebda81e6c1a9c5a5d0bfdc236399ba6595e1397842fd4a45648c30f35e"}, - {file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:0653d012b3bf45f194e5e6a41df9258811ac8fc395579fa82958a8b76286bea4"}, - {file = "regex-2022.10.31-cp36-cp36m-win32.whl", hash = "sha256:144486e029793a733e43b2e37df16a16df4ceb62102636ff3db6033994711066"}, - {file = "regex-2022.10.31-cp36-cp36m-win_amd64.whl", hash = "sha256:c14b63c9d7bab795d17392c7c1f9aaabbffd4cf4387725a0ac69109fb3b550c6"}, - {file = "regex-2022.10.31-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:4cac3405d8dda8bc6ed499557625585544dd5cbf32072dcc72b5a176cb1271c8"}, - {file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23cbb932cc53a86ebde0fb72e7e645f9a5eec1a5af7aa9ce333e46286caef783"}, - {file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:74bcab50a13960f2a610cdcd066e25f1fd59e23b69637c92ad470784a51b1347"}, - {file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:78d680ef3e4d405f36f0d6d1ea54e740366f061645930072d39bca16a10d8c93"}, - {file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ce6910b56b700bea7be82c54ddf2e0ed792a577dfaa4a76b9af07d550af435c6"}, - {file = "regex-2022.10.31-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:659175b2144d199560d99a8d13b2228b85e6019b6e09e556209dfb8c37b78a11"}, - {file = "regex-2022.10.31-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:1ddf14031a3882f684b8642cb74eea3af93a2be68893901b2b387c5fd92a03ec"}, - {file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b683e5fd7f74fb66e89a1ed16076dbab3f8e9f34c18b1979ded614fe10cdc4d9"}, - {file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:2bde29cc44fa81c0a0c8686992c3080b37c488df167a371500b2a43ce9f026d1"}, - {file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:4919899577ba37f505aaebdf6e7dc812d55e8f097331312db7f1aab18767cce8"}, - {file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:9c94f7cc91ab16b36ba5ce476f1904c91d6c92441f01cd61a8e2729442d6fcf5"}, - {file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:ae1e96785696b543394a4e3f15f3f225d44f3c55dafe3f206493031419fedf95"}, - {file = "regex-2022.10.31-cp37-cp37m-win32.whl", hash = "sha256:c670f4773f2f6f1957ff8a3962c7dd12e4be54d05839b216cb7fd70b5a1df394"}, - {file = "regex-2022.10.31-cp37-cp37m-win_amd64.whl", hash = "sha256:8e0caeff18b96ea90fc0eb6e3bdb2b10ab5b01a95128dfeccb64a7238decf5f0"}, - {file = "regex-2022.10.31-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:131d4be09bea7ce2577f9623e415cab287a3c8e0624f778c1d955ec7c281bd4d"}, - {file = "regex-2022.10.31-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:e613a98ead2005c4ce037c7b061f2409a1a4e45099edb0ef3200ee26ed2a69a8"}, - {file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:052b670fafbe30966bbe5d025e90b2a491f85dfe5b2583a163b5e60a85a321ad"}, - {file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aa62a07ac93b7cb6b7d0389d8ef57ffc321d78f60c037b19dfa78d6b17c928ee"}, - {file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:5352bea8a8f84b89d45ccc503f390a6be77917932b1c98c4cdc3565137acc714"}, - {file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:20f61c9944f0be2dc2b75689ba409938c14876c19d02f7585af4460b6a21403e"}, - {file = "regex-2022.10.31-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:29c04741b9ae13d1e94cf93fca257730b97ce6ea64cfe1eba11cf9ac4e85afb6"}, - {file = "regex-2022.10.31-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:543883e3496c8b6d58bd036c99486c3c8387c2fc01f7a342b760c1ea3158a318"}, - {file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:b7a8b43ee64ca8f4befa2bea4083f7c52c92864d8518244bfa6e88c751fa8fff"}, - {file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:6a9a19bea8495bb419dc5d38c4519567781cd8d571c72efc6aa959473d10221a"}, - {file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:6ffd55b5aedc6f25fd8d9f905c9376ca44fcf768673ffb9d160dd6f409bfda73"}, - {file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:4bdd56ee719a8f751cf5a593476a441c4e56c9b64dc1f0f30902858c4ef8771d"}, - {file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:8ca88da1bd78990b536c4a7765f719803eb4f8f9971cc22d6ca965c10a7f2c4c"}, - {file = "regex-2022.10.31-cp38-cp38-win32.whl", hash = "sha256:5a260758454580f11dd8743fa98319bb046037dfab4f7828008909d0aa5292bc"}, - {file = "regex-2022.10.31-cp38-cp38-win_amd64.whl", hash = "sha256:5e6a5567078b3eaed93558842346c9d678e116ab0135e22eb72db8325e90b453"}, - {file = "regex-2022.10.31-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5217c25229b6a85049416a5c1e6451e9060a1edcf988641e309dbe3ab26d3e49"}, - {file = "regex-2022.10.31-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4bf41b8b0a80708f7e0384519795e80dcb44d7199a35d52c15cc674d10b3081b"}, - {file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cf0da36a212978be2c2e2e2d04bdff46f850108fccc1851332bcae51c8907cc"}, - {file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d403d781b0e06d2922435ce3b8d2376579f0c217ae491e273bab8d092727d244"}, - {file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a37d51fa9a00d265cf73f3de3930fa9c41548177ba4f0faf76e61d512c774690"}, - {file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4f781ffedd17b0b834c8731b75cce2639d5a8afe961c1e58ee7f1f20b3af185"}, - {file = "regex-2022.10.31-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d243b36fbf3d73c25e48014961e83c19c9cc92530516ce3c43050ea6276a2ab7"}, - {file = "regex-2022.10.31-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:370f6e97d02bf2dd20d7468ce4f38e173a124e769762d00beadec3bc2f4b3bc4"}, - {file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:597f899f4ed42a38df7b0e46714880fb4e19a25c2f66e5c908805466721760f5"}, - {file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7dbdce0c534bbf52274b94768b3498abdf675a691fec5f751b6057b3030f34c1"}, - {file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:22960019a842777a9fa5134c2364efaed5fbf9610ddc5c904bd3a400973b0eb8"}, - {file = 
"regex-2022.10.31-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:7f5a3ffc731494f1a57bd91c47dc483a1e10048131ffb52d901bfe2beb6102e8"}, - {file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:7ef6b5942e6bfc5706301a18a62300c60db9af7f6368042227ccb7eeb22d0892"}, - {file = "regex-2022.10.31-cp39-cp39-win32.whl", hash = "sha256:395161bbdbd04a8333b9ff9763a05e9ceb4fe210e3c7690f5e68cedd3d65d8e1"}, - {file = "regex-2022.10.31-cp39-cp39-win_amd64.whl", hash = "sha256:957403a978e10fb3ca42572a23e6f7badff39aa1ce2f4ade68ee452dc6807692"}, - {file = "regex-2022.10.31.tar.gz", hash = "sha256:a3a98921da9a1bf8457aeee6a551948a83601689e5ecdd736894ea9bbec77e83"}, + {file = "regex-2023.3.22-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:68e9add923bda8357e6fe65a568766feae369063cb7210297067675cce65272f"}, + {file = "regex-2023.3.22-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:b280cb303fed94199f0b976595af71ebdcd388fb5e377a8198790f1016a23476"}, + {file = "regex-2023.3.22-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:328a70e578f37f59eb54e8450b5042190bbadf2ef7f5c0b60829574b62955ed7"}, + {file = "regex-2023.3.22-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c00c357a4914f58398503c7f716cf1646b1e36b8176efa35255f5ebfacedfa46"}, + {file = "regex-2023.3.22-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d40cecf4bcb2cb37c59e3c79e5bbc45d47e3f3e07edf24e35fc5775db2570058"}, + {file = "regex-2023.3.22-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:43469c22fcf705a7cb59c7e01d6d96975bdbc54c1138900f04d11496489a0054"}, + {file = "regex-2023.3.22-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d4d3571c8eb21f0fbe9f0b21b49092c24d442f9a295f079949df3551b2886f29"}, + {file = "regex-2023.3.22-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:148ad520f41021b97870e9c80420e6cdaadcc5e4306e613aed84cd5d53f8a7ca"}, + {file = "regex-2023.3.22-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:24242e5f26823e95edd64969bd206d4752c1a56a744d8cbcf58461f9788bc0c7"}, + {file = "regex-2023.3.22-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:60fcef5c3144d861b623456d87ca7fff7af59a4a918e1364cdd0687b48285285"}, + {file = "regex-2023.3.22-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:533ba64d67d882286557106a1c5f12b4c2825f11b47a7c209a8c22922ca882be"}, + {file = "regex-2023.3.22-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:80a288b21b17e39fb3630cf1d14fd704499bb11d9c8fc110662a0c57758d3d3e"}, + {file = "regex-2023.3.22-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fa41a427d4f03ec6d6da2fd8a230f4f388f336cd7ca46b46c4d2a1bca3ead85a"}, + {file = "regex-2023.3.22-cp310-cp310-win32.whl", hash = "sha256:3c4fa90fd91cc2957e66195ce374331bebbc816964864f64b42bd14bda773b53"}, + {file = "regex-2023.3.22-cp310-cp310-win_amd64.whl", hash = "sha256:a4c7b8c5a3a186b49415af3be18e4b8f93b33d6853216c0a1d7401736b703bce"}, + {file = "regex-2023.3.22-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0a2a851d0548a4e298d88e3ceeb4bad4aab751cf1883edf6150f25718ce0207a"}, + {file = "regex-2023.3.22-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f2bc8a9076ea7add860d57dbee0554a212962ecf2a900344f2fc7c56a02463b0"}, + {file = "regex-2023.3.22-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e30d9a6fd7a7a6a4da6f80d167ce8eda4a993ff24282cbc73f34186c46a498db"}, + 
{file = "regex-2023.3.22-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3371975b165c1e859e1990e5069e8606f00b25aed961cfd25b7bac626b1eb5a9"}, + {file = "regex-2023.3.22-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33c887b658afb144cdc8ce9156a0e1098453060c18b8bd5177f831ad58e0d60d"}, + {file = "regex-2023.3.22-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd47362e03acc780aad5a5bc4624d495594261b55a1f79a5b775b6be865a5911"}, + {file = "regex-2023.3.22-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7798b3d662f70cea425637c54da30ef1894d426cab24ee7ffaaccb24a8b17bb8"}, + {file = "regex-2023.3.22-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:bdab2c90665b88faf5cc5e11bf835d548f4b8d8060c89fc70782b6020850aa1c"}, + {file = "regex-2023.3.22-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:55f907c4d18a5a40da0ceb339a0beda77c9df47c934adad987793632fb4318c3"}, + {file = "regex-2023.3.22-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:e00b046000b313ffaa2f6e8d7290b33b08d2005150eff4c8cf3ad74d011888d1"}, + {file = "regex-2023.3.22-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:20ce96da2093e72e151d6af8217a629aeb5f48f1ac543c2fffd1d87c57699d7e"}, + {file = "regex-2023.3.22-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:8527ea0978ed6dc58ccb3935bd2883537b455c97ec44b5d8084677dfa817f96b"}, + {file = "regex-2023.3.22-cp311-cp311-win32.whl", hash = "sha256:4c9c3db90acd17e4231344a23616f33fd79837809584ce30e2450ca312fa47aa"}, + {file = "regex-2023.3.22-cp311-cp311-win_amd64.whl", hash = "sha256:e1b56dac5e86ab52e0443d63b02796357202a8f8c5966b69f8d4c03a94778e98"}, + {file = "regex-2023.3.22-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:33bab9c9af936123b70b9874ce83f2bcd54be76b97637b33d31560fba8ad5d78"}, + {file = "regex-2023.3.22-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b59233cb8df6b60fff5f3056f6f342a8f5f04107a11936bf49ebff87dd4ace34"}, + {file = "regex-2023.3.22-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3f6f29cb134d782685f8eda01d72073c483c7f87b318b5101c7001faef7850f5"}, + {file = "regex-2023.3.22-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d15a0cc48f7a3055e89df1bd6623a907c407d1f58f67ff47064e598d4a550de4"}, + {file = "regex-2023.3.22-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:159c7b83488a056365119ada0bceddc06a455d3db7a7aa3cf07f13b2878b885f"}, + {file = "regex-2023.3.22-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aff7c778d9229d66f716ad98a701fa91cf97935ae4a32a145ae9e61619906aaa"}, + {file = "regex-2023.3.22-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3e66cfc915f5f7e2c8a0af8a27f87aa857f440de7521fd7f2682e23f082142a1"}, + {file = "regex-2023.3.22-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:3b4da28d89527572f0d4a24814e353e1228a7aeda965e5d9265c1435a154b17a"}, + {file = "regex-2023.3.22-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:5da83c964aecb6c3f2a6c9a03f3d0fa579e1ad208e2c264ba826cecd19da11fa"}, + {file = "regex-2023.3.22-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:81291006a934052161eae8340e7731ea6b8595b0c27dd4927c4e8a489e1760e2"}, + {file = "regex-2023.3.22-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:c95a977cfdccb8ddef95ddd77cf586fe9dc327c7c93cf712983cece70cdaa1be"}, + {file = 
"regex-2023.3.22-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:cdd3d2df486c9a8c6d08f78bdfa8ea7cf6191e037fde38c2cf6f5f0559e9d353"}, + {file = "regex-2023.3.22-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:f311ca33fcb9f8fb060c1fa76238d8d029f33b71a2021bafa5d423cc25965b54"}, + {file = "regex-2023.3.22-cp38-cp38-win32.whl", hash = "sha256:2e2e6baf4a1108f84966f44870b26766d8f6d104c9959aae329078327c677122"}, + {file = "regex-2023.3.22-cp38-cp38-win_amd64.whl", hash = "sha256:60b545806a433cc752b9fa936f1c0a63bf96a3872965b958b35bd0d5d788d411"}, + {file = "regex-2023.3.22-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5826e7fb443acb49f64f9648a2852efc8d9af2f4c67f6c3dca69dccd9e8e1d15"}, + {file = "regex-2023.3.22-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:59b3aab231c27cd754d6452c43b12498d34e7ab87d69a502bd0220f4b1c090c4"}, + {file = "regex-2023.3.22-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:97326d62255203c6026896d4b1ad6b5a0141ba097cae00ed3a508fe454e96baf"}, + {file = "regex-2023.3.22-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:59a15c2803c20702d7f2077807d9a2b7d9a168034b87fd3f0d8361de60019a1e"}, + {file = "regex-2023.3.22-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4ad467524cb6879ce42107cf02a49cdb4a06f07fe0e5f1160d7db865a8d25d4b"}, + {file = "regex-2023.3.22-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:617d101b95151d827d5366e9c4225a68c64d56065e41ab9c7ef51bb87f347a8a"}, + {file = "regex-2023.3.22-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:548257463696daf919d2fdfc53ee4b98e29e3ffc5afddd713d83aa849d1fa178"}, + {file = "regex-2023.3.22-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:1937946dd03818845bd9c1713dfd3173a7b9a324e6593a235fc8c51c9cd460eb"}, + {file = "regex-2023.3.22-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d94a0d25e517c76c9ce9e2e2635d9d1a644b894f466a66a10061f4e599cdc019"}, + {file = "regex-2023.3.22-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:87016850c13082747bd120558e6750746177bd492b103b2fca761c8a1c43fba9"}, + {file = "regex-2023.3.22-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:3582db55372eaee9e998d378109c4b9b15beb2c84624c767efe351363fada9c4"}, + {file = "regex-2023.3.22-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:88552925fd22320600c59ee80342d6eb06bfa9503c3a402d7327983f5fa999d9"}, + {file = "regex-2023.3.22-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8d7477ebaf5d3621c763702e1ec0daeede8863fb22459c5e26ddfd17e9b1999c"}, + {file = "regex-2023.3.22-cp39-cp39-win32.whl", hash = "sha256:dcc5b0d6a94637c071a427dc4469efd0ae4fda8ff384790bc8b5baaf9308dc3e"}, + {file = "regex-2023.3.22-cp39-cp39-win_amd64.whl", hash = "sha256:f1977c1fe28173f2349d42c59f80f10a97ce34f2bedb7b7f55e2e8a8de9b7dfb"}, + {file = "regex-2023.3.22.tar.gz", hash = "sha256:f579a202b90c1110d0894a86b32a89bf550fdb34bdd3f9f550115706be462e19"}, ] [[package]] @@ -2146,6 +2297,21 @@ pygments = ">=2.13.0,<3.0.0" [package.extras] jupyter = ["ipywidgets (>=7.5.1,<9)"] +[[package]] +name = "rsa" +version = "4.9" +description = "Pure-Python RSA implementation" +category = "main" +optional = false +python-versions = ">=3.6,<4" +files = [ + {file = "rsa-4.9-py3-none-any.whl", hash = "sha256:90260d9058e514786967344d0ef75fa8727eed8a7d2e43ce9f4bcf1b536174f7"}, + {file = "rsa-4.9.tar.gz", hash = 
"sha256:e38464a49c6c85d7f1351b0126661487a7e0a14a50f1675ec50eb34d4f20ef21"}, +] + +[package.dependencies] +pyasn1 = ">=0.1.3" + [[package]] name = "ruamel-yaml" version = "0.17.21" @@ -2432,6 +2598,18 @@ files = [ {file = "typing_extensions-4.5.0.tar.gz", hash = "sha256:5cb5f4a79139d699607b3ef622a1dedafa84e115ab0024e0d9c044a9479ca7cb"}, ] +[[package]] +name = "uritemplate" +version = "4.1.1" +description = "Implementation of RFC 6570 URI Templates" +category = "main" +optional = false +python-versions = ">=3.6" +files = [ + {file = "uritemplate-4.1.1-py2.py3-none-any.whl", hash = "sha256:830c08b8d99bdd312ea4ead05994a38e8936266f84b9a7878232db50b044e02e"}, + {file = "uritemplate-4.1.1.tar.gz", hash = "sha256:4346edfc5c3b79f694bccd6d6099a322bbeb628dbf2cd86eea55a456ce5124f0"}, +] + [[package]] name = "urllib3" version = "1.26.15" @@ -2670,4 +2848,4 @@ docs = ["mkdocs", "mkdocs-material"] [metadata] lock-version = "2.0" python-versions = "^3.9" -content-hash = "ab9263db7c7c836f9192444bcf680a4183d16ae49cff7d60082e61a80973001d" +content-hash = "245d449b4b8d1cbe6be0a125761312a6a53bf1f050440a41ffe8034945064fb8" diff --git a/prowler/compliance/gcp/__init__.py b/prowler/compliance/gcp/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/config/config.py b/prowler/config/config.py index 5ee60c61..7d8de5bd 100644 --- a/prowler/config/config.py +++ b/prowler/config/config.py @@ -33,6 +33,8 @@ with os.scandir(compliance_aws_dir) as files: # AWS services-regions matrix json aws_services_json_file = "aws_regions_by_service.json" +# gcp_zones_json_file = "gcp_zones.json" + default_output_directory = getcwd() + "/output" output_file_timestamp = timestamp.strftime("%Y%m%d%H%M%S") diff --git a/prowler/lib/check/models.py b/prowler/lib/check/models.py index 6a63da49..c89d0142 100644 --- a/prowler/lib/check/models.py +++ b/prowler/lib/check/models.py @@ -128,6 +128,23 @@ class Check_Report_Azure(Check_Report): self.subscription = "" +@dataclass +class Check_Report_GCP(Check_Report): + """Contains the GCP Check's finding information.""" + + resource_name: str + resource_id: str + project_id: str + location: str + + def __init__(self, metadata): + super().__init__(metadata) + self.resource_name = "" + self.resource_id = "" + self.project_id = "" + self.location = "" + + # Testing Pending def load_check_metadata(metadata_file: str) -> Check_Metadata_Model: """load_check_metadata loads and parse a Check's metadata file""" diff --git a/prowler/lib/cli/parser.py b/prowler/lib/cli/parser.py index c04fe072..71778b52 100644 --- a/prowler/lib/cli/parser.py +++ b/prowler/lib/cli/parser.py @@ -58,6 +58,7 @@ Detailed documentation at https://docs.prowler.cloud # Init Providers Arguments self.__init_aws_parser__() self.__init_azure_parser__() + self.__init_gcp_parser__() def parse(self, args=None) -> argparse.Namespace: """ @@ -431,3 +432,18 @@ Detailed documentation at https://docs.prowler.cloud default=[], help="Azure subscription ids to be scanned by prowler", ) + + def __init_gcp_parser__(self): + """Init the GCP Provider CLI parser""" + gcp_parser = self.subparsers.add_parser( + "gcp", parents=[self.common_providers_parser], help="GCP Provider" + ) + # Authentication Modes + gcp_auth_subparser = gcp_parser.add_argument_group("Authentication Modes") + gcp_auth_modes_group = gcp_auth_subparser.add_mutually_exclusive_group() + gcp_auth_modes_group.add_argument( + "--credentials-file", + nargs="?", + metavar="FILE_PATH", + help="Authenticate using a Google Service Account Application Credentials JSON 
file", + ) diff --git a/prowler/lib/outputs/file_descriptors.py b/prowler/lib/outputs/file_descriptors.py index dd06ffa9..5fda584e 100644 --- a/prowler/lib/outputs/file_descriptors.py +++ b/prowler/lib/outputs/file_descriptors.py @@ -16,11 +16,13 @@ from prowler.lib.outputs.models import ( Check_Output_CSV_CIS, Check_Output_CSV_ENS_RD2022, Check_Output_CSV_Generic_Compliance, + Gcp_Check_Output_CSV, generate_csv_fields, ) from prowler.lib.utils.utils import file_exists, open_file from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info from prowler.providers.azure.lib.audit_info.models import Azure_Audit_Info +from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info def initialize_file_descriptor( @@ -82,6 +84,13 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit audit_info, Azure_Check_Output_CSV, ) + if isinstance(audit_info, GCP_Audit_Info): + file_descriptor = initialize_file_descriptor( + filename, + output_mode, + audit_info, + Gcp_Check_Output_CSV, + ) file_descriptors.update({output_mode: file_descriptor}) elif output_mode == "json": diff --git a/prowler/lib/outputs/models.py b/prowler/lib/outputs/models.py index cfec510d..7d623a1b 100644 --- a/prowler/lib/outputs/models.py +++ b/prowler/lib/outputs/models.py @@ -56,6 +56,19 @@ def generate_provider_output_csv( ) finding_output = output_model(**data) + if provider == "gcp": + data["resource_id"] = finding.resource_id + data["resource_name"] = finding.resource_name + data["project_id"] = finding.project_id + data["location"] = finding.location + data[ + "finding_unique_id" + ] = f"prowler-{provider}-{finding.check_metadata.CheckID}-{finding.project_id}-{finding.resource_id}" + data["compliance"] = unroll_dict( + get_check_compliance(finding, provider, output_options) + ) + finding_output = output_model(**data) + if provider == "aws": data["profile"] = audit_info.profile data["account_id"] = audit_info.audited_account @@ -305,6 +318,17 @@ class Azure_Check_Output_CSV(Check_Output_CSV): resource_name: str = "" +class Gcp_Check_Output_CSV(Check_Output_CSV): + """ + Gcp_Check_Output_CSV generates a finding's output in CSV format for the GCP provider. + """ + + project_id: str = "" + location: str = "" + resource_id: str = "" + resource_name: str = "" + + def generate_provider_output_json( provider: str, finding, audit_info, mode: str, output_options ): @@ -333,6 +357,16 @@ def generate_provider_output_json( finding, provider, output_options ) + if provider == "gcp": + finding_output.ProjectId = audit_info.project_id + finding_output.Location = finding.location + finding_output.ResourceId = finding.resource_id + finding_output.ResourceName = finding.resource_name + finding_output.FindingUniqueId = f"prowler-{provider}-{finding.check_metadata.CheckID}-{finding.project_id}-{finding.resource_id}" + finding_output.Compliance = get_check_compliance( + finding, provider, output_options + ) + if provider == "aws": finding_output.Profile = audit_info.profile finding_output.AccountId = audit_info.audited_account @@ -421,6 +455,20 @@ class Azure_Check_Output_JSON(Check_Output_JSON): super().__init__(**metadata) +class Gcp_Check_Output_JSON(Check_Output_JSON): + """ + Gcp_Check_Output_JSON generates a finding's output in JSON format for the AWS provider. 
+ """ + + ProjectId: str = "" + ResourceId: str = "" + ResourceName: str = "" + Location: str = "" + + def __init__(self, **metadata): + super().__init__(**metadata) + + class Check_Output_CSV_ENS_RD2022(BaseModel): """ Check_Output_CSV_ENS_RD2022 generates a finding's output in CSV ENS RD2022 format. diff --git a/prowler/lib/outputs/outputs.py b/prowler/lib/outputs/outputs.py index 9562493a..3a1ffb59 100644 --- a/prowler/lib/outputs/outputs.py +++ b/prowler/lib/outputs/outputs.py @@ -33,6 +33,8 @@ def stdout_report(finding, color, verbose, is_quiet): details = finding.region if finding.check_metadata.Provider == "azure": details = finding.check_metadata.ServiceName + if finding.check_metadata.Provider == "gcp": + details = finding.location if verbose and not (is_quiet and finding.status != "FAIL"): print( diff --git a/prowler/lib/outputs/summary_table.py b/prowler/lib/outputs/summary_table.py index f503f621..2e0bc6fa 100644 --- a/prowler/lib/outputs/summary_table.py +++ b/prowler/lib/outputs/summary_table.py @@ -26,6 +26,9 @@ def display_summary_table( else: entity_type = "Tenant ID/s" audited_entities = " ".join(audit_info.identity.tenant_ids) + elif provider == "gcp": + entity_type = "Project ID" + audited_entities = audit_info.project_id if findings: current = { @@ -53,7 +56,6 @@ def display_summary_table( current["Service"] != finding.check_metadata.ServiceName and current["Service"] ): - add_service_to_table(findings_table, current) current["Total"] = current["Critical"] = current["High"] = current[ diff --git a/prowler/providers/common/audit_info.py b/prowler/providers/common/audit_info.py index b51106b3..149517fe 100644 --- a/prowler/providers/common/audit_info.py +++ b/prowler/providers/common/audit_info.py @@ -25,6 +25,9 @@ from prowler.providers.aws.lib.resource_api_tagging.resource_api_tagging import from prowler.providers.azure.azure_provider import Azure_Provider from prowler.providers.azure.lib.audit_info.audit_info import azure_audit_info from prowler.providers.azure.lib.audit_info.models import Azure_Audit_Info +from prowler.providers.gcp.gcp_provider import GCP_Provider +from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info +from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info class Audit_Info: @@ -41,7 +44,7 @@ class Audit_Info: else: return caller_identity - def print_audit_credentials(self, audit_info: AWS_Audit_Info): + def print_aws_credentials(self, audit_info: AWS_Audit_Info): # Beautify audited regions, set "all" if there is no filter region regions = ( ", ".join(audit_info.audited_regions) @@ -61,6 +64,25 @@ Caller Identity ARN: {Fore.YELLOW}[{audit_info.audited_identity_arn}]{Style.RESE # If -A is set, print Assumed Role ARN if audit_info.assumed_role_info.role_arn is not None: report += f"""Assumed Role ARN: {Fore.YELLOW}[{audit_info.assumed_role_info.role_arn}]{Style.RESET_ALL} +""" + print(report) + + def print_gcp_credentials(self, audit_info: GCP_Audit_Info): + # Beautify audited profile, set "default" if there is no profile set + try: + getattr(audit_info.credentials, "_service_account_email") + profile = ( + audit_info.credentials._service_account_email + if audit_info.credentials._service_account_email is not None + else "default" + ) + except AttributeError: + profile = "default" + + report = f""" +This report is being generated using credentials below: + +GCP Account: {Fore.YELLOW}[{profile}]{Style.RESET_ALL} GCP Project ID: {Fore.YELLOW}[{audit_info.project_id}]{Style.RESET_ALL} """ print(report) @@ -257,7 +279,7 
@@ Caller Identity ARN: {Fore.YELLOW}[{audit_info.audited_identity_arn}]{Style.RESE current_audit_info.profile_region = "us-east-1" if not arguments.get("only_logs"): - self.print_audit_credentials(current_audit_info) + self.print_aws_credentials(current_audit_info) # Parse Scan Tags if arguments.get("resource_tags"): @@ -320,6 +342,29 @@ Caller Identity ARN: {Fore.YELLOW}[{audit_info.audited_identity_arn}]{Style.RESE return azure_audit_info + def set_gcp_audit_info(self, arguments) -> GCP_Audit_Info: + """ + set_gcp_audit_info returns the GCP_Audit_Info + """ + logger.info("Setting GCP session ...") + + logger.info("Checking if any credentials mode is set ...") + credentials_file = arguments.get("credentials_file") + + gcp_provider = GCP_Provider( + credentials_file, + ) + + ( + gcp_audit_info.credentials, + gcp_audit_info.project_id, + ) = gcp_provider.get_credentials() + + if not arguments.get("only_logs"): + self.print_gcp_credentials(gcp_audit_info) + + return gcp_audit_info + def set_provider_audit_info(provider: str, arguments: dict): """ diff --git a/prowler/providers/common/outputs.py b/prowler/providers/common/outputs.py index 72d4fe11..81a6d024 100644 --- a/prowler/providers/common/outputs.py +++ b/prowler/providers/common/outputs.py @@ -77,6 +77,27 @@ class Azure_Output_Options(Provider_Output_Options): arguments.output_modes.remove("html") +class Gcp_Output_Options(Provider_Output_Options): + def __init__(self, arguments, audit_info, allowlist_file, bulk_checks_metadata): + # First call Provider_Output_Options init + super().__init__(arguments, allowlist_file, bulk_checks_metadata) + + # Check if custom output filename was input, if not, set the default + if ( + not hasattr(arguments, "output_filename") + or arguments.output_filename is None + ): + self.output_filename = ( + f"prowler-output-{audit_info.project_id}-{output_file_timestamp}" + ) + else: + self.output_filename = arguments.output_filename + + # Remove HTML Output since it is not supported yet + if "html" in arguments.output_modes: + arguments.output_modes.remove("html") + + class Aws_Output_Options(Provider_Output_Options): security_hub_enabled: bool diff --git a/prowler/providers/gcp/__init__.py b/prowler/providers/gcp/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/gcp_provider.py b/prowler/providers/gcp/gcp_provider.py new file mode 100644 index 00000000..74b006ef --- /dev/null +++ b/prowler/providers/gcp/gcp_provider.py @@ -0,0 +1,51 @@ +import os +import sys + +from google import auth +from googleapiclient import discovery +from googleapiclient.discovery import Resource + +from prowler.lib.logger import logger +from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info + + +class GCP_Provider: + def __init__( + self, + credentials_file: str, + ): + logger.info("Instantiating GCP Provider ...") + self.credentials, self.project_id = self.__set_credentials__(credentials_file) + + def __set_credentials__(self, credentials_file): + try: + if credentials_file: + self.__set_gcp_creds_env_var__(credentials_file) + + return auth.default() + except Exception as error: + logger.critical(f"{error.__class__.__name__} -- {error}") + sys.exit(1) + + def __set_gcp_creds_env_var__(self, credentials_file): + logger.info( + "GCP provider: Setting GOOGLE_APPLICATION_CREDENTIALS environment variable..." 
+ ) + client_secrets_path = os.path.abspath(credentials_file) + os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = client_secrets_path + + def get_credentials(self): + return self.credentials, self.project_id + + +def generate_client( + service: str, + api_version: str, + audit_info: GCP_Audit_Info, +) -> Resource: + try: + return discovery.build(service, api_version, credentials=audit_info.credentials) + except Exception as error: + logger.error( + f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}" + ) diff --git a/prowler/providers/gcp/lib/audit_info/__init__.py b/prowler/providers/gcp/lib/audit_info/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/lib/audit_info/audit_info.py b/prowler/providers/gcp/lib/audit_info/audit_info.py new file mode 100644 index 00000000..45a01243 --- /dev/null +++ b/prowler/providers/gcp/lib/audit_info/audit_info.py @@ -0,0 +1,8 @@ +from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info + +gcp_audit_info = GCP_Audit_Info( + credentials=None, + project_id=None, + audit_resources=None, + audit_metadata=None, +) diff --git a/prowler/providers/gcp/lib/audit_info/models.py b/prowler/providers/gcp/lib/audit_info/models.py new file mode 100644 index 00000000..192e110a --- /dev/null +++ b/prowler/providers/gcp/lib/audit_info/models.py @@ -0,0 +1,18 @@ +from dataclasses import dataclass +from typing import Any, Optional + +from google.oauth2.credentials import Credentials + + +@dataclass +class GCP_Audit_Info: + credentials: Credentials + project_id: str + audit_resources: Optional[Any] + audit_metadata: Optional[Any] + + def __init__(self, credentials, project_id, audit_metadata, audit_resources): + self.credentials = credentials + self.project_id = project_id + self.audit_metadata = audit_metadata + self.audit_resources = audit_resources diff --git a/prowler/providers/gcp/services/bigquery/__init__.py b/prowler/providers/gcp/services/bigquery/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/bigquery/bigquery_client.py b/prowler/providers/gcp/services/bigquery/bigquery_client.py new file mode 100644 index 00000000..fca7c124 --- /dev/null +++ b/prowler/providers/gcp/services/bigquery/bigquery_client.py @@ -0,0 +1,4 @@ +from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info +from prowler.providers.gcp.services.bigquery.bigquery_service import BigQuery + +bigquery_client = BigQuery(gcp_audit_info) diff --git a/prowler/providers/gcp/services/bigquery/bigquery_dataset_cmk_encryption/__init__.py b/prowler/providers/gcp/services/bigquery/bigquery_dataset_cmk_encryption/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/bigquery/bigquery_dataset_cmk_encryption/bigquery_dataset_cmk_encryption.metadata.json b/prowler/providers/gcp/services/bigquery/bigquery_dataset_cmk_encryption/bigquery_dataset_cmk_encryption.metadata.json new file mode 100644 index 00000000..27dcc68a --- /dev/null +++ b/prowler/providers/gcp/services/bigquery/bigquery_dataset_cmk_encryption/bigquery_dataset_cmk_encryption.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "bigquery_dataset_cmk_encryption", + "CheckTitle": "Ensure BigQuery datasets are encrypted with Customer-Managed Keys (CMKs).", + "CheckType": [], + "ServiceName": "bigquery", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "high", + "ResourceType": "Dataset", + "Description": "Ensure BigQuery datasets are 
encrypted with Customer-Managed Keys (CMKs) in order to have a more granular control over data encryption/decryption process.", + "Risk": "If you want to have greater control, Customer-managed encryption keys (CMEK) can be used as encryption key management solution for BigQuery Data Sets.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "https://docs.bridgecrew.io/docs/bc_gcp_sql_11#cli-command", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/BigQuery/enable-table-encryption-with-cmks.html", + "Terraform": "https://docs.bridgecrew.io/docs/ensure-gcp-big-query-tables-are-encrypted-with-customer-supplied-encryption-keys-csek-1#terraform" + }, + "Recommendation": { + "Text": "Encrypting datasets with Cloud KMS Customer-Managed Keys (CMKs) will allow for a more granular control over data encryption/decryption process.", + "Url": "https://cloud.google.com/bigquery/docs/customer-managed-encryption" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/bigquery/bigquery_dataset_cmk_encryption/bigquery_dataset_cmk_encryption.py b/prowler/providers/gcp/services/bigquery/bigquery_dataset_cmk_encryption/bigquery_dataset_cmk_encryption.py new file mode 100644 index 00000000..e5cc1b2c --- /dev/null +++ b/prowler/providers/gcp/services/bigquery/bigquery_dataset_cmk_encryption/bigquery_dataset_cmk_encryption.py @@ -0,0 +1,23 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.bigquery.bigquery_client import bigquery_client + + +class bigquery_dataset_cmk_encryption(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for dataset in bigquery_client.datasets: + report = Check_Report_GCP(self.metadata()) + report.project_id = bigquery_client.project_id + report.resource_id = dataset.id + report.resource_name = dataset.name + report.location = dataset.region + report.status = "PASS" + report.status_extended = ( + f"Dataset {dataset.name} is encrypted with Customer-Managed Keys (CMKs)" + ) + if not dataset.cmk_encryption: + report.status = "FAIL" + report.status_extended = f"Dataset {dataset.name} is not encrypted with Customer-Managed Keys (CMKs)" + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/bigquery/bigquery_dataset_public_access/__init__.py b/prowler/providers/gcp/services/bigquery/bigquery_dataset_public_access/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/bigquery/bigquery_dataset_public_access/bigquery_dataset_public_access.metadata.json b/prowler/providers/gcp/services/bigquery/bigquery_dataset_public_access/bigquery_dataset_public_access.metadata.json new file mode 100644 index 00000000..228a73cc --- /dev/null +++ b/prowler/providers/gcp/services/bigquery/bigquery_dataset_public_access/bigquery_dataset_public_access.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "bigquery_dataset_public_access", + "CheckTitle": "Ensure That BigQuery Datasets Are Not Anonymously or Publicly Accessible.", + "CheckType": [], + "ServiceName": "bigquery", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "high", + "ResourceType": "Dataset", + "Description": "Ensure That BigQuery Datasets Are Not Anonymously or Publicly Accessible.", + "Risk": "Granting permissions to allUsers or allAuthenticatedUsers allows anyone to access the dataset. 
Such access might not be desirable if sensitive data is being stored in the dataset. Therefore, ensure that anonymous and/or public access to a dataset is not allowed.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/BigQuery/publicly-accessible-big-query-datasets.html",
+      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_general_3#terraform"
+    },
+    "Recommendation": {
+      "Text": "It is recommended that the IAM policy on BigQuery datasets does not allow anonymous and/or public access.",
+      "Url": "https://cloud.google.com/bigquery/docs/customer-managed-encryption"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/bigquery/bigquery_dataset_public_access/bigquery_dataset_public_access.py b/prowler/providers/gcp/services/bigquery/bigquery_dataset_public_access/bigquery_dataset_public_access.py
new file mode 100644
index 00000000..c706a9a1
--- /dev/null
+++ b/prowler/providers/gcp/services/bigquery/bigquery_dataset_public_access/bigquery_dataset_public_access.py
@@ -0,0 +1,23 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.bigquery.bigquery_client import bigquery_client
+
+
+class bigquery_dataset_public_access(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for dataset in bigquery_client.datasets:
+            report = Check_Report_GCP(self.metadata())
+            report.project_id = bigquery_client.project_id
+            report.resource_id = dataset.id
+            report.resource_name = dataset.name
+            report.location = dataset.region
+            report.status = "PASS"
+            report.status_extended = f"Dataset {dataset.name} is not publicly accessible"
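+            # `dataset.public` is derived in bigquery_service.py from the
+            # dataset's access entries: it is True when "allUsers" or
+            # "allAuthenticatedUsers" appears among the granted members.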
+            if dataset.public:
+                report.status = "FAIL"
+                report.status_extended = (
+                    f"Dataset {dataset.name} is publicly accessible"
+                )
+            findings.append(report)
+
+        return findings
diff --git a/prowler/providers/gcp/services/bigquery/bigquery_service.py b/prowler/providers/gcp/services/bigquery/bigquery_service.py
new file mode 100644
index 00000000..01e235dd
--- /dev/null
+++ b/prowler/providers/gcp/services/bigquery/bigquery_service.py
@@ -0,0 +1,113 @@
+from pydantic import BaseModel
+
+from prowler.lib.logger import logger
+from prowler.providers.gcp.gcp_provider import generate_client
+
+
+################## BigQuery
+class BigQuery:
+    def __init__(self, audit_info):
+        self.service = "bigquery"
+        self.api_version = "v2"
+        self.project_id = audit_info.project_id
+        self.client = generate_client(self.service, self.api_version, audit_info)
+        self.datasets = []
+        self.tables = []
+        self.__get_datasets__()
+        self.__get_tables__()
+
+    def __get_datasets__(self):
+        try:
+            request = self.client.datasets().list(projectId=self.project_id)
+            while request is not None:
+                response = request.execute()
+
+                for dataset in response.get("datasets", []):
+                    dataset_info = (
+                        self.client.datasets()
+                        .get(
+                            projectId=self.project_id,
+                            datasetId=dataset["datasetReference"]["datasetId"],
+                        )
+                        .execute()
+                    )
+                    cmk_encryption = False
+                    public = False
+                    roles = dataset_info.get("access", "")
+                    if "allAuthenticatedUsers" in str(roles) or "allUsers" in str(
+                        roles
+                    ):
+                        public = True
+                    if dataset_info.get("defaultEncryptionConfiguration"):
+                        cmk_encryption = True
+                    self.datasets.append(
+                        Dataset(
+                            name=dataset["datasetReference"]["datasetId"],
+                            id=dataset["id"],
+                            region=dataset["location"],
+                            cmk_encryption=cmk_encryption,
+                            public=public,
+                        )
+                    )
+
+                request = self.client.datasets().list_next(
+                    previous_request=request, previous_response=response
+                )
+        except Exception as error:
+            logger.error(
+                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
+            )
+
+    def __get_tables__(self):
+        try:
+            for dataset in self.datasets:
+                request = self.client.tables().list(
+                    projectId=self.project_id, datasetId=dataset.name
+                )
+                while request is not None:
+                    response = request.execute()
+
+                    for table in response.get("tables", []):
+                        cmk_encryption = False
+                        if (
+                            self.client.tables()
+                            .get(
+                                projectId=self.project_id,
+                                datasetId=dataset.name,
+                                tableId=table["tableReference"]["tableId"],
+                            )
+                            .execute()
+                            .get("encryptionConfiguration")
+                        ):
+                            cmk_encryption = True
+                        self.tables.append(
+                            Table(
+                                name=table["tableReference"]["tableId"],
+                                id=table["id"],
+                                region=dataset.region,
+                                cmk_encryption=cmk_encryption,
+                            )
+                        )
+
+                    request = self.client.tables().list_next(
+                        previous_request=request, previous_response=response
+                    )
+        except Exception as error:
+            logger.error(
+                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
+            )
+
+
+class Dataset(BaseModel):
+    name: str
+    id: str
+    region: str
+    cmk_encryption: bool
+    public: bool
+
+
+class Table(BaseModel):
+    name: str
+    id: str
+    region: str
+    cmk_encryption: bool
diff --git a/prowler/providers/gcp/services/bigquery/bigquery_table_cmk_encryption/__init__.py b/prowler/providers/gcp/services/bigquery/bigquery_table_cmk_encryption/__init__.py
new file mode 100644
index 00000000..e69de29b
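As a point of reference, the sketch below shows the discovery-client pattern that the `bigquery_service.py` file above builds on: resolve Application Default Credentials, build a `bigquery` v2 client, and page through `datasets().list` with `list_next`. This is a minimal standalone illustration, not part of the patch; the `print` call and the direct use of the ADC-resolved project are our assumptions.

```python
from google import auth
from googleapiclient import discovery

# ADC resolution order: GOOGLE_APPLICATION_CREDENTIALS, then gcloud user
# credentials, then the attached service account via the metadata server.
credentials, project_id = auth.default()
client = discovery.build("bigquery", "v2", credentials=credentials)

request = client.datasets().list(projectId=project_id)
while request is not None:
    response = request.execute()
    for dataset in response.get("datasets", []):
        print(dataset["datasetReference"]["datasetId"], dataset["location"])
    # list_next() returns None once the last page has been consumed
    request = client.datasets().list_next(
        previous_request=request, previous_response=response
    )
```

diff --git a/prowler/providers/gcp/services/bigquery/bigquery_table_cmk_encryption/bigquery_table_cmk_encryption.metadata.json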
b/prowler/providers/gcp/services/bigquery/bigquery_table_cmk_encryption/bigquery_table_cmk_encryption.metadata.json new file mode 100644 index 00000000..e3c3d748 --- /dev/null +++ b/prowler/providers/gcp/services/bigquery/bigquery_table_cmk_encryption/bigquery_table_cmk_encryption.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "bigquery_table_cmk_encryption", + "CheckTitle": "Ensure BigQuery tables are encrypted with Customer-Managed Keys (CMKs).", + "CheckType": [], + "ServiceName": "bigquery", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "high", + "ResourceType": "Table", + "Description": "Ensure BigQuery tables are encrypted with Customer-Managed Keys (CMKs) in order to have a more granular control over data encryption/decryption process.", + "Risk": "If you want to have greater control, Customer-managed encryption keys (CMEK) can be used as encryption key management solution for BigQuery Tables.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/BigQuery/enable-table-encryption-with-cmks.html", + "Terraform": "https://docs.bridgecrew.io/docs/ensure-gcp-big-query-tables-are-encrypted-with-customer-supplied-encryption-keys-csek#terraform" + }, + "Recommendation": { + "Text": "Encrypting tables with Cloud KMS Customer-Managed Keys (CMKs) will allow for a more granular control over data encryption/decryption process.", + "Url": "https://cloud.google.com/bigquery/docs/customer-managed-encryption" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/bigquery/bigquery_table_cmk_encryption/bigquery_table_cmk_encryption.py b/prowler/providers/gcp/services/bigquery/bigquery_table_cmk_encryption/bigquery_table_cmk_encryption.py new file mode 100644 index 00000000..ea4e723b --- /dev/null +++ b/prowler/providers/gcp/services/bigquery/bigquery_table_cmk_encryption/bigquery_table_cmk_encryption.py @@ -0,0 +1,23 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.bigquery.bigquery_client import bigquery_client + + +class bigquery_table_cmk_encryption(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for table in bigquery_client.tables: + report = Check_Report_GCP(self.metadata()) + report.project_id = bigquery_client.project_id + report.resource_id = table.id + report.resource_name = table.name + report.location = table.region + report.status = "PASS" + report.status_extended = ( + f"Table {table.name} is encrypted with Customer-Managed Keys (CMKs)" + ) + if not table.cmk_encryption: + report.status = "FAIL" + report.status_extended = f"Table {table.name} is not encrypted with Customer-Managed Keys (CMKs)" + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/cloudresourcemanager/__init__.py b/prowler/providers/gcp/services/cloudresourcemanager/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudresourcemanager/cloudresourcemanager_client.py b/prowler/providers/gcp/services/cloudresourcemanager/cloudresourcemanager_client.py new file mode 100644 index 00000000..aaf574fc --- /dev/null +++ b/prowler/providers/gcp/services/cloudresourcemanager/cloudresourcemanager_client.py @@ -0,0 +1,6 @@ +from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info +from 
prowler.providers.gcp.services.cloudresourcemanager.cloudresourcemanager_service import ( + CloudResourceManager, +) + +cloudresourcemanager_client = CloudResourceManager(gcp_audit_info) diff --git a/prowler/providers/gcp/services/cloudresourcemanager/cloudresourcemanager_service.py b/prowler/providers/gcp/services/cloudresourcemanager/cloudresourcemanager_service.py new file mode 100644 index 00000000..ef47bb39 --- /dev/null +++ b/prowler/providers/gcp/services/cloudresourcemanager/cloudresourcemanager_service.py @@ -0,0 +1,41 @@ +from pydantic import BaseModel + +from prowler.lib.logger import logger +from prowler.providers.gcp.gcp_provider import generate_client + + +################## CloudResourceManager +class CloudResourceManager: + def __init__(self, audit_info): + self.service = "cloudresourcemanager" + self.api_version = "v1" + self.region = "global" + self.project_id = audit_info.project_id + self.client = generate_client(self.service, self.api_version, audit_info) + self.bindings = [] + self.__get_iam_policy__() + + def __get_client__(self): + return self.client + + def __get_iam_policy__(self): + try: + policy = ( + self.client.projects().getIamPolicy(resource=self.project_id).execute() + ) + for binding in policy["bindings"]: + self.bindings.append( + Binding( + role=binding["role"], + members=binding["members"], + ) + ) + except Exception as error: + logger.error( + f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}" + ) + + +class Binding(BaseModel): + role: str + members: list diff --git a/prowler/providers/gcp/services/cloudsql/__init__.py b/prowler/providers/gcp/services/cloudsql/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_client.py b/prowler/providers/gcp/services/cloudsql/cloudsql_client.py new file mode 100644 index 00000000..fa51c28f --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_client.py @@ -0,0 +1,4 @@ +from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info +from prowler.providers.gcp.services.cloudsql.cloudsql_service import CloudSQL + +cloudsql_client = CloudSQL(gcp_audit_info) diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_automated_backups/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_automated_backups/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_automated_backups/cloudsql_instance_automated_backups.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_automated_backups/cloudsql_instance_automated_backups.metadata.json new file mode 100644 index 00000000..43516e8a --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_automated_backups/cloudsql_instance_automated_backups.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "cloudsql_instance_automated_backups", + "CheckTitle": "Ensure That Cloud SQL Database Instances Are Configured With Automated Backups", + "CheckType": [], + "ServiceName": "cloudsql", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "DatabaseInstance", + "Description": "Ensure That Cloud SQL Database Instances Are Configured With Automated Backups", + "Risk": "Backups provide a way to restore a Cloud SQL instance to recover lost data or recover from a problem with that instance. 
Automated backups need to be set for any instance that contains data that should be protected from loss or damage. This recommendation is applicable for SQL Server, PostgreSql, MySql generation 1 and MySql generation 2 instances.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "gcloud sql instances patch --backup-start-time <[HH:MM]>", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/enable-automated-backups.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "It is recommended to have all SQL database instances set to enable automated backups.", + "Url": "https://cloud.google.com/sql/docs/postgres/configure-ssl-instance/" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_automated_backups/cloudsql_instance_automated_backups.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_automated_backups/cloudsql_instance_automated_backups.py new file mode 100644 index 00000000..8ad0dd00 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_automated_backups/cloudsql_instance_automated_backups.py @@ -0,0 +1,23 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_automated_backups(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for instance in cloudsql_client.instances: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "PASS" + report.status_extended = ( + f"Database Instance {instance.name} has automated backups configured" + ) + if not instance.automated_backups: + report.status = "FAIL" + report.status_extended = f"Database Instance {instance.name} does not have automated backups configured" + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_local_infile_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_local_infile_flag/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_local_infile_flag/cloudsql_instance_mysql_local_infile_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_local_infile_flag/cloudsql_instance_mysql_local_infile_flag.metadata.json new file mode 100644 index 00000000..348ba887 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_local_infile_flag/cloudsql_instance_mysql_local_infile_flag.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "cloudsql_instance_mysql_local_infile_flag", + "CheckTitle": "Ensure That the Local_infile Database Flag for a Cloud SQL MySQL Instance Is Set to Off", + "CheckType": [], + "ServiceName": "cloudsql", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "DatabaseInstance", + "Description": "Ensure That the Local_infile Database Flag for a Cloud SQL MySQL Instance Is Set to Off", + "Risk": "The local_infile flag controls the server-side LOCAL capability for LOAD DATA statements. 
Depending on the local_infile setting, the server refuses or permits local data loading by clients that have LOCAL enabled on the client side.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags local_infile=off", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-local-infile-flag.html", + "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_1#terraform" + }, + "Recommendation": { + "Text": "It is recommended to set the local_infile database flag for a Cloud SQL MySQL instance to off.", + "Url": "https://cloud.google.com/sql/docs/mysql/flags" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_local_infile_flag/cloudsql_instance_mysql_local_infile_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_local_infile_flag/cloudsql_instance_mysql_local_infile_flag.py new file mode 100644 index 00000000..da176ab7 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_local_infile_flag/cloudsql_instance_mysql_local_infile_flag.py @@ -0,0 +1,24 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_mysql_local_infile_flag(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for instance in cloudsql_client.instances: + if "MYSQL" in instance.version: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "FAIL" + report.status_extended = f"MySQL Instance {instance.name} has not 'local_infile' flag set to 'off'" + for flag in instance.flags: + if flag["name"] == "local_infile" and flag["value"] == "off": + report.status = "PASS" + report.status_extended = f"MySQL Instance {instance.name} has 'local_infile' flag set to 'off'" + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_skip_show_database_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_skip_show_database_flag/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_skip_show_database_flag/cloudsql_instance_mysql_skip_show_database_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_skip_show_database_flag/cloudsql_instance_mysql_skip_show_database_flag.metadata.json new file mode 100644 index 00000000..2d7b8764 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_skip_show_database_flag/cloudsql_instance_mysql_skip_show_database_flag.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "cloudsql_instance_mysql_skip_show_database_flag", + "CheckTitle": "Ensure Skip_show_database Database Flag for Cloud SQL MySQL Instance Is Set to On", + "CheckType": [], + "ServiceName": "cloudsql", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "DatabaseInstance", + "Description": "Ensure Skip_show_database Database Flag for Cloud SQL MySQL Instance Is Set to On", + "Risk": "'skip_show_database' database flag prevents people from using the SHOW DATABASES 
statement if they do not have the SHOW DATABASES privilege.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags skip_show_database=on", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/enable-skip-show-database-flag.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "It is recommended to set skip_show_database database flag for Cloud SQL Mysql instance to on.", + "Url": "https://cloud.google.com/sql/docs/mysql/flags" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_skip_show_database_flag/cloudsql_instance_mysql_skip_show_database_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_skip_show_database_flag/cloudsql_instance_mysql_skip_show_database_flag.py new file mode 100644 index 00000000..47e1572b --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_mysql_skip_show_database_flag/cloudsql_instance_mysql_skip_show_database_flag.py @@ -0,0 +1,24 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_mysql_skip_show_database_flag(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for instance in cloudsql_client.instances: + if "MYSQL" in instance.version: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "FAIL" + report.status_extended = f"MySQL Instance {instance.name} has not 'skip_show_database' flag set to 'on'" + for flag in instance.flags: + if flag["name"] == "skip_show_database" and flag["value"] == "on": + report.status = "PASS" + report.status_extended = f"MySQL Instance {instance.name} has 'skip_show_database' flag set to 'on'" + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_enable_pgaudit_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_enable_pgaudit_flag/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_enable_pgaudit_flag/cloudsql_instance_postgres_enable_pgaudit_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_enable_pgaudit_flag/cloudsql_instance_postgres_enable_pgaudit_flag.metadata.json new file mode 100644 index 00000000..7ec83567 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_enable_pgaudit_flag/cloudsql_instance_postgres_enable_pgaudit_flag.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "cloudsql_instance_postgres_enable_pgaudit_flag", + "CheckTitle": "Ensure That 'cloudsql.enable_pgaudit' Database Flag for each Cloud Sql Postgresql Instance Is Set to 'on' For Centralized Logging", + "CheckType": [], + "ServiceName": "cloudsql", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "DatabaseInstance", + "Description": "Ensure That 'cloudsql.enable_pgaudit' Database Flag for each Cloud Sql Postgresql Instance Is Set to 'on' For Centralized Logging", + "Risk": "Ensure cloudsql.enable_pgaudit database flag for Cloud SQL PostgreSQL 
instance is set to on to allow for centralized logging.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags cloudsql.enable_pgaudit=On", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/postgre-sql-audit-flag.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "As numerous other recommendations in this section consist of turning on flags for logging purposes, your organization will need a way to manage these logs. You may have a solution already in place. If you do not, consider installing and enabling the open source pgaudit extension within PostgreSQL and enabling its corresponding flag of cloudsql.enable_pgaudit. This flag and installing the extension enables database auditing in PostgreSQL through the open-source pgAudit extension. This extension provides detailed session and object logging to comply with government, financial, & ISO standards and provides auditing capabilities to mitigate threats by monitoring security events on the instance. Enabling the flag and settings later in this recommendation will send these logs to Google Logs Explorer so that you can access them in a central location.", + "Url": "https://cloud.google.com/sql/docs/postgres/flags" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_enable_pgaudit_flag/cloudsql_instance_postgres_enable_pgaudit_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_enable_pgaudit_flag/cloudsql_instance_postgres_enable_pgaudit_flag.py new file mode 100644 index 00000000..d837768a --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_enable_pgaudit_flag/cloudsql_instance_postgres_enable_pgaudit_flag.py @@ -0,0 +1,27 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_postgres_enable_pgaudit_flag(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for instance in cloudsql_client.instances: + if "POSTGRES" in instance.version: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "FAIL" + report.status_extended = f"PostgreSQL Instance {instance.name} has not 'cloudsql.enable_pgaudit' flag set to 'on'" + for flag in instance.flags: + if ( + flag["name"] == "cloudsql.enable_pgaudit" + and flag["value"] == "on" + ): + report.status = "PASS" + report.status_extended = f"PostgreSQL Instance {instance.name} has 'cloudsql.enable_pgaudit' flag set to 'on'" + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_connections_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_connections_flag/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_connections_flag/cloudsql_instance_postgres_log_connections_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_connections_flag/cloudsql_instance_postgres_log_connections_flag.metadata.json new file mode 100644 index 00000000..c8a93154 --- /dev/null +++ 
b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_connections_flag/cloudsql_instance_postgres_log_connections_flag.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "cloudsql_instance_postgres_log_connections_flag", + "CheckTitle": "Ensure That the Log_connections Database Flag for Cloud SQL PostgreSQL Instance Is Set to On", + "CheckType": [], + "ServiceName": "cloudsql", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "DatabaseInstance", + "Description": "Ensure That the Log_connections Database Flag for Cloud SQL PostgreSQL Instance Is Set to On", + "Risk": "Enabling the log_connections setting causes each attempted connection to the server to be logged, along with successful completion of client authentication. This parameter cannot be changed after the session starts.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_connections=on", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/enable-log-connections-flag.html", + "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_3#terraform" + }, + "Recommendation": { + "Text": "PostgreSQL does not log attempted connections by default. Enabling the log_connections setting will create log entries for each attempted connection as well as successful completion of client authentication which can be useful in troubleshooting issues and to determine any unusual connection attempts to the server.", + "Url": "https://cloud.google.com/sql/docs/postgres/flags" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_connections_flag/cloudsql_instance_postgres_log_connections_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_connections_flag/cloudsql_instance_postgres_log_connections_flag.py new file mode 100644 index 00000000..8a9bab63 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_connections_flag/cloudsql_instance_postgres_log_connections_flag.py @@ -0,0 +1,24 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_postgres_log_connections_flag(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for instance in cloudsql_client.instances: + if "POSTGRES" in instance.version: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "FAIL" + report.status_extended = f"PostgreSQL Instance {instance.name} has not 'log_connections' flag set to 'on'" + for flag in instance.flags: + if flag["name"] == "log_connections" and flag["value"] == "on": + report.status = "PASS" + report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_connections' flag set to 'on'" + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_disconnections_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_disconnections_flag/__init__.py new file mode 100644 index 00000000..e69de29b diff --git 
a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_disconnections_flag/cloudsql_instance_postgres_log_disconnections_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_disconnections_flag/cloudsql_instance_postgres_log_disconnections_flag.metadata.json new file mode 100644 index 00000000..a4fe2cd4 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_disconnections_flag/cloudsql_instance_postgres_log_disconnections_flag.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "cloudsql_instance_postgres_log_disconnections_flag", + "CheckTitle": "Ensure That the log_disconnections Database Flag for Cloud SQL PostgreSQL Instance Is Set to On", + "CheckType": [], + "ServiceName": "cloudsql", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "DatabaseInstance", + "Description": "Ensure That the log_disconnections Database Flag for Cloud SQL PostgreSQL Instance Is Set to On", + "Risk": "PostgreSQL does not log session details such as duration and session end by default. Enabling the log_disconnections setting will create log entries at the end of each session which can be useful in troubleshooting issues and determine any unusual activity across a time period.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_disconnections=on", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/enable-log-connections-flag.html", + "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_4#terraform" + }, + "Recommendation": { + "Text": "Enabling the log_disconnections setting logs the end of each session, including the session duration.", + "Url": "https://cloud.google.com/sql/docs/postgres/flags" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_disconnections_flag/cloudsql_instance_postgres_log_disconnections_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_disconnections_flag/cloudsql_instance_postgres_log_disconnections_flag.py new file mode 100644 index 00000000..a63186e9 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_disconnections_flag/cloudsql_instance_postgres_log_disconnections_flag.py @@ -0,0 +1,24 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_postgres_log_disconnections_flag(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for instance in cloudsql_client.instances: + if "POSTGRES" in instance.version: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "FAIL" + report.status_extended = f"PostgreSQL Instance {instance.name} has not 'log_disconnections' flag set to 'on'" + for flag in instance.flags: + if flag["name"] == "log_disconnections" and flag["value"] == "on": + report.status = "PASS" + report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_disconnections' flag set to 'on'" + break + findings.append(report) + + return findings diff --git 
a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_error_verbosity_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_error_verbosity_flag/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_error_verbosity_flag/cloudsql_instance_postgres_log_error_verbosity_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_error_verbosity_flag/cloudsql_instance_postgres_log_error_verbosity_flag.metadata.json
new file mode 100644
index 00000000..5907f298
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_error_verbosity_flag/cloudsql_instance_postgres_log_error_verbosity_flag.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudsql_instance_postgres_log_error_verbosity_flag",
+  "CheckTitle": "Ensure Log_error_verbosity Database Flag for Cloud SQL PostgreSQL Instance Is Set to DEFAULT or Stricter",
+  "CheckType": [],
+  "ServiceName": "cloudsql",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "DatabaseInstance",
+  "Description": "Ensure Log_error_verbosity Database Flag for Cloud SQL PostgreSQL Instance Is Set to DEFAULT or Stricter",
+  "Risk": "The log_error_verbosity flag controls the verbosity/details of messages logged. TERSE excludes the logging of DETAIL, HINT, QUERY, and CONTEXT error information. VERBOSE output includes the SQLSTATE error code, source code file name, function name, and line number that generated the error. Ensure an appropriate value is set to 'DEFAULT' or stricter.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_error_verbosity=default",
+      "NativeIaC": "",
+      "Other": "",
+      "Terraform": ""
+    },
+    "Recommendation": {
+      "Text": "Auditing helps in troubleshooting operational problems and also permits forensic analysis. If log_error_verbosity is not set to the correct value, too many details or too few details may be logged. This flag should be configured with a value of 'DEFAULT' or stricter.
This recommendation is applicable to PostgreSQL database instances.",
+      "Url": "https://cloud.google.com/sql/docs/postgres/flags"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_error_verbosity_flag/cloudsql_instance_postgres_log_error_verbosity_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_error_verbosity_flag/cloudsql_instance_postgres_log_error_verbosity_flag.py
new file mode 100644
index 00000000..a72d53d4
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_error_verbosity_flag/cloudsql_instance_postgres_log_error_verbosity_flag.py
@@ -0,0 +1,27 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client
+
+
+class cloudsql_instance_postgres_log_error_verbosity_flag(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for instance in cloudsql_client.instances:
+            if "POSTGRES" in instance.version:
+                report = Check_Report_GCP(self.metadata())
+                report.project_id = cloudsql_client.project_id
+                report.resource_id = instance.name
+                report.resource_name = instance.name
+                report.location = instance.region
+                report.status = "FAIL"
+                report.status_extended = f"PostgreSQL Instance {instance.name} has not 'log_error_verbosity' flag set to 'default'"
+                for flag in instance.flags:
+                    if (
+                        flag["name"] == "log_error_verbosity"
+                        and flag["value"] == "default"
+                    ):
+                        report.status = "PASS"
+                        report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_error_verbosity' flag set to 'default'"
+                        break
+                findings.append(report)
+
+        return findings
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_duration_statement_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_duration_statement_flag/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_duration_statement_flag/cloudsql_instance_postgres_log_min_duration_statement_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_duration_statement_flag/cloudsql_instance_postgres_log_min_duration_statement_flag.metadata.json
new file mode 100644
index 00000000..62e2d165
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_duration_statement_flag/cloudsql_instance_postgres_log_min_duration_statement_flag.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudsql_instance_postgres_log_min_duration_statement_flag",
+  "CheckTitle": "Ensure that the Log_min_duration_statement Flag for a Cloud SQL PostgreSQL Instance Is Set to -1",
+  "CheckType": [],
+  "ServiceName": "cloudsql",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "DatabaseInstance",
+  "Description": "Ensure that the Log_min_duration_statement Flag for a Cloud SQL PostgreSQL Instance Is Set to -1",
+  "Risk": "The log_min_duration_statement flag defines the minimum amount of execution time of a statement in milliseconds where the total duration of the statement is logged.
Ensure that log_min_duration_statement is disabled, i.e., a value of -1 is set.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_min_duration_statement=-1", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/configure-log-min-error-statement-flag.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "Logging SQL statements may include sensitive information that should not be recorded in logs. This recommendation is applicable to PostgreSQL database instances.", + "Url": "https://cloud.google.com/sql/docs/postgres/flags" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_duration_statement_flag/cloudsql_instance_postgres_log_min_duration_statement_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_duration_statement_flag/cloudsql_instance_postgres_log_min_duration_statement_flag.py new file mode 100644 index 00000000..299d232f --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_duration_statement_flag/cloudsql_instance_postgres_log_min_duration_statement_flag.py @@ -0,0 +1,27 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_postgres_log_min_duration_statement_flag(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for instance in cloudsql_client.instances: + if "POSTGRES" in instance.version: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "FAIL" + report.status_extended = f"PostgreSQL Instance {instance.name} has not 'log_min_duration_statement' flag set to '-1'" + for flag in instance.flags: + if ( + flag["name"] == "log_min_duration_statement" + and flag["value"] == "-1" + ): + report.status = "PASS" + report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_min_duration_statement' flag set to '-1'" + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_error_statement_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_error_statement_flag/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_error_statement_flag/cloudsql_instance_postgres_log_min_error_statement_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_error_statement_flag/cloudsql_instance_postgres_log_min_error_statement_flag.metadata.json new file mode 100644 index 00000000..570f1422 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_error_statement_flag/cloudsql_instance_postgres_log_min_error_statement_flag.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "cloudsql_instance_postgres_log_min_error_statement_flag", + "CheckTitle": "Ensure that the Log_min_error_statement Flag for a Cloud SQL PostgreSQL Instance Is Set Appropriately", + "CheckType": [], + "ServiceName": "cloudsql", + "SubServiceName": "", + "ResourceIdTemplate": "", + 
"Severity": "medium", + "ResourceType": "DatabaseInstance", + "Description": "Ensure that the Log_min_error_statement Flag for a Cloud SQL PostgreSQL Instance Is Set Appropriately", + "Risk": "The log_min_error_statement flag defines the minimum message severity level that are considered as an error statement. Messages for error statements are logged with the SQL statement. Valid values include DEBUG5, DEBUG4, DEBUG3, DEBUG2, DEBUG1, INFO, NOTICE, WARNING, ERROR, LOG, FATAL, and PANIC. Each severity level includes the subsequent levels mentioned above. Ensure a value of ERROR or stricter is set.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_min_error_statement=error", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/configure-log-min-error-statement-flag.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "Auditing helps in troubleshooting operational problems and also permits forensic analysis. If log_min_error_statement is not set to the correct value, messages may not be classified as error messages appropriately. Considering general log messages as error messages would make is difficult to find actual errors and considering only stricter severity levels as error messages may skip actual errors to log their SQL statements. The log_min_error_statement flag should be set to ERROR or stricter.", + "Url": "https://cloud.google.com/sql/docs/postgres/flags" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_error_statement_flag/cloudsql_instance_postgres_log_min_error_statement_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_error_statement_flag/cloudsql_instance_postgres_log_min_error_statement_flag.py new file mode 100644 index 00000000..f626a941 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_error_statement_flag/cloudsql_instance_postgres_log_min_error_statement_flag.py @@ -0,0 +1,28 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_postgres_log_min_error_statement_flag(Check): + def execute(self) -> Check_Report_GCP: + desired_log_min_error_statement = "error" + findings = [] + for instance in cloudsql_client.instances: + if "POSTGRES" in instance.version: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "FAIL" + report.status_extended = f"PostgreSQL Instance {instance.name} has not 'log_min_error_statement' flag set minimum to '{desired_log_min_error_statement}'" + for flag in instance.flags: + if ( + flag["name"] == "log_min_error_statement" + and flag["value"] == desired_log_min_error_statement + ): + report.status = "PASS" + report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_min_error_statement' flag set minimum to '{desired_log_min_error_statement}'" + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_messages_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_messages_flag/__init__.py new 
file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_messages_flag/cloudsql_instance_postgres_log_min_messages_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_messages_flag/cloudsql_instance_postgres_log_min_messages_flag.metadata.json new file mode 100644 index 00000000..591d234b --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_messages_flag/cloudsql_instance_postgres_log_min_messages_flag.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "cloudsql_instance_postgres_log_min_messages_flag", + "CheckTitle": "Ensure that the Log_min_messages Flag for a Cloud SQL PostgreSQL Instance Is Set Appropriately", + "CheckType": [], + "ServiceName": "cloudsql", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "DatabaseInstance", + "Description": "Ensure that the Log_min_messages Flag for a Cloud SQL PostgreSQL Instance Is Set Appropriately", + "Risk": "Auditing helps in troubleshooting operational problems and also permits forensic analysis. If log_min_messages is not set to the correct value, messages may not be classified as error messages appropriately. An organization will need to decide their own threshold for logging log_min_messages flag.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_min_messages=warning", + "NativeIaC": "", + "Other": "", + "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_4#terraform" + }, + "Recommendation": { + "Text": "The log_min_messages flag defines the minimum message severity level that is considered as an error statement. Messages for error statements are logged with the SQL statement. Valid values include DEBUG5, DEBUG4, DEBUG3, DEBUG2, DEBUG1, INFO, NOTICE, WARNING, ERROR, LOG, FATAL, and PANIC. Each severity level includes the subsequent levels mentioned above. ERROR is considered the best practice setting. 
Changes should only be made in accordance with the organization's logging policy.", + "Url": "https://cloud.google.com/sql/docs/postgres/flags" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_messages_flag/cloudsql_instance_postgres_log_min_messages_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_messages_flag/cloudsql_instance_postgres_log_min_messages_flag.py new file mode 100644 index 00000000..21a3ee37 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_min_messages_flag/cloudsql_instance_postgres_log_min_messages_flag.py @@ -0,0 +1,28 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_postgres_log_min_messages_flag(Check): + def execute(self) -> Check_Report_GCP: + desired_log_min_messages = "error" + findings = [] + for instance in cloudsql_client.instances: + if "POSTGRES" in instance.version: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "FAIL" + report.status_extended = f"PostgreSQL Instance {instance.name} has not 'log_min_messages' flag set minimum to '{desired_log_min_messages}'" + for flag in instance.flags: + if ( + flag["name"] == "log_min_messages" + and flag["value"] == desired_log_min_messages + ): + report.status = "PASS" + report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_min_messages' flag set minimum to '{desired_log_min_messages}'" + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_statement_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_statement_flag/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_statement_flag/cloudsql_instance_postgres_log_statement_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_statement_flag/cloudsql_instance_postgres_log_statement_flag.metadata.json new file mode 100644 index 00000000..21937a3e --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_statement_flag/cloudsql_instance_postgres_log_statement_flag.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "cloudsql_instance_postgres_log_statement_flag", + "CheckTitle": "Ensure That the Log_statement Database Flag for Cloud SQL PostgreSQL Instance Is Set Appropriately", + "CheckType": [], + "ServiceName": "cloudsql", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "DatabaseInstance", + "Description": "Ensure That the Log_statement Database Flag for Cloud SQL PostgreSQL Instance Is Set Appropriately", + "Risk": "Auditing helps in forensic analysis. 
If log_statement is not set to the correct value, too many statements may be logged leading to issues in finding the relevant information from the logs, or too few statements may be logged with relevant information missing from the logs.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags log_statement=ddl", + "NativeIaC": "", + "Other": "", + "Terraform": "" + }, + "Recommendation": { + "Text": "The value ddl logs all data definition statements. A value of 'ddl' is recommended unless otherwise directed by your organization's logging policy.", + "Url": "https://cloud.google.com/sql/docs/postgres/flags" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_statement_flag/cloudsql_instance_postgres_log_statement_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_statement_flag/cloudsql_instance_postgres_log_statement_flag.py new file mode 100644 index 00000000..7036f34d --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_postgres_log_statement_flag/cloudsql_instance_postgres_log_statement_flag.py @@ -0,0 +1,28 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_postgres_log_statement_flag(Check): + def execute(self) -> Check_Report_GCP: + desired_log_statement = "ddl" + findings = [] + for instance in cloudsql_client.instances: + if "POSTGRES" in instance.version: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "FAIL" + report.status_extended = f"PostgreSQL Instance {instance.name} has not 'log_statement' flag set to '{desired_log_statement}'" + for flag in instance.flags: + if ( + flag["name"] == "log_statement" + and flag["value"] == desired_log_statement + ): + report.status = "PASS" + report.status_extended = f"PostgreSQL Instance {instance.name} has 'log_statement' flag set to '{desired_log_statement}'" + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/cloudsql_instance_private_ip_assignment.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/cloudsql_instance_private_ip_assignment.metadata.json new file mode 100644 index 00000000..b7c4d1be --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/cloudsql_instance_private_ip_assignment.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "cloudsql_instance_private_ip_assignment", + "CheckTitle": "Ensure Instance IP assignment is set to private", + "CheckType": [], + "ServiceName": "cloudsql", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "DatabaseInstance", + "Description": "Ensure Instance IP assignment is set to private", + "Risk": "Instance addresses can be public IP or private IP. 
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/cloudsql_instance_private_ip_assignment.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/cloudsql_instance_private_ip_assignment.metadata.json
new file mode 100644
index 00000000..b7c4d1be
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/cloudsql_instance_private_ip_assignment.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudsql_instance_private_ip_assignment",
+  "CheckTitle": "Ensure Instance IP assignment is set to private",
+  "CheckType": [],
+  "ServiceName": "cloudsql",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "DatabaseInstance",
+  "Description": "Ensure Instance IP assignment is set to private",
+  "Risk": "Instance addresses can be public IP or private IP. Public IP means that the instance is accessible through the public internet. In contrast, instances using only private IP are not accessible through the public internet, but are accessible through a Virtual Private Cloud (VPC). Limiting network access to your database will limit potential attacks.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "",
+      "NativeIaC": "",
+      "Other": "",
+      "Terraform": ""
+    },
+    "Recommendation": {
+      "Text": "Restricting database access to private IP addresses only will reduce the attack surface.",
+      "Url": "https://cloud.google.com/sql/docs/mysql/configure-private-ip"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/cloudsql_instance_private_ip_assignment.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/cloudsql_instance_private_ip_assignment.py
new file mode 100644
index 00000000..9fb7307e
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_private_ip_assignment/cloudsql_instance_private_ip_assignment.py
@@ -0,0 +1,25 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client
+
+
+class cloudsql_instance_private_ip_assignment(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for instance in cloudsql_client.instances:
+            report = Check_Report_GCP(self.metadata())
+            report.project_id = cloudsql_client.project_id
+            report.resource_id = instance.name
+            report.resource_name = instance.name
+            report.location = instance.region
+            report.status = "PASS"
+            report.status_extended = f"Database Instance {instance.name} does not have public IP assignments"
+            for address in instance.ip_addresses:
+                if address["type"] != "PRIVATE":
+                    report.status = "FAIL"
+                    report.status_extended = (
+                        f"Database Instance {instance.name} has public IP assignments"
+                    )
+                    break
+            findings.append(report)
+
+        return findings
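The private-IP check above fails as soon as any entry in the instance's `ipAddresses` list has a type other than `PRIVATE`. A standalone sketch of that predicate; the sample entries are hypothetical, shaped like Cloud SQL Admin API `ipAddresses` items (typed `PRIMARY`, `OUTGOING`, or `PRIVATE`):

```python
# Standalone sketch: an instance passes only when every assigned address is PRIVATE.
# Sample data is hypothetical.

def only_private(ip_addresses: list) -> bool:
    return all(addr["type"] == "PRIVATE" for addr in ip_addresses)

assert only_private([{"type": "PRIVATE", "ipAddress": "10.0.0.5"}])
assert not only_private([
    {"type": "PRIMARY", "ipAddress": "203.0.113.10"},  # public address -> FAIL
    {"type": "PRIVATE", "ipAddress": "10.0.0.5"},
])
```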
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_access/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_access/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_access/cloudsql_instance_public_access.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_access/cloudsql_instance_public_access.metadata.json
new file mode 100644
index 00000000..59b07dcb
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_access/cloudsql_instance_public_access.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudsql_instance_public_access",
+  "CheckTitle": "Ensure That Cloud SQL Database Instances Do Not Implicitly Whitelist All Public IP Addresses",
+  "CheckType": [],
+  "ServiceName": "cloudsql",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "high",
+  "ResourceType": "DatabaseInstance",
+  "Description": "Ensure That Cloud SQL Database Instances Do Not Implicitly Whitelist All Public IP Addresses",
+  "Risk": "To minimize the attack surface of a Database server instance, only trusted/known and required IP(s) should be whitelisted to connect to it. An authorized network should not have IPs/networks configured to 0.0.0.0/0, which allows access to the instance from anywhere in the world. Note that authorized networks apply only to instances with public IPs.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "gcloud sql instances patch INSTANCE_NAME --authorized-networks=IP_ADDR1,IP_ADDR2...",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/publicly-accessible-cloud-sql-instances.html",
+      "Terraform": ""
+    },
+    "Recommendation": {
+      "Text": "Database Server should accept connections only from trusted Network(s)/IP(s) and restrict access from public IP addresses.",
+      "Url": "https://cloud.google.com/sql/docs/mysql/connection-org-policy"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_access/cloudsql_instance_public_access.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_access/cloudsql_instance_public_access.py
new file mode 100644
index 00000000..abc75f79
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_access/cloudsql_instance_public_access.py
@@ -0,0 +1,22 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client
+
+
+class cloudsql_instance_public_access(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for instance in cloudsql_client.instances:
+            report = Check_Report_GCP(self.metadata())
+            report.project_id = cloudsql_client.project_id
+            report.resource_id = instance.name
+            report.resource_name = instance.name
+            report.location = instance.region
+            report.status = "PASS"
+            report.status_extended = f"Database Instance {instance.name} does not whitelist all Public IP Addresses"
+            for network in instance.authorized_networks:
+                if network["value"] == "0.0.0.0/0":
+                    report.status = "FAIL"
+                    report.status_extended = f"Database Instance {instance.name} whitelists all Public IP Addresses"
+            findings.append(report)
+
+        return findings
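The public-access check reduces to spotting the catch-all CIDR `0.0.0.0/0` among the instance's authorized networks. A standalone sketch, with hypothetical entries shaped like `settings.ipConfiguration.authorizedNetworks` items:

```python
# Standalone sketch: an instance fails when any authorized network is the
# catch-all CIDR. Sample entries are hypothetical.

OPEN_TO_WORLD = "0.0.0.0/0"

def whitelists_everyone(authorized_networks: list) -> bool:
    return any(net["value"] == OPEN_TO_WORLD for net in authorized_networks)

assert whitelists_everyone([{"name": "anyone", "value": "0.0.0.0/0"}])
assert not whitelists_everyone([{"name": "office", "value": "198.51.100.0/24"}])
```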
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_ip/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_ip/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_ip/cloudsql_instance_public_ip.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_ip/cloudsql_instance_public_ip.metadata.json
new file mode 100644
index 00000000..28f0e3c1
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_ip/cloudsql_instance_public_ip.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudsql_instance_public_ip",
+  "CheckTitle": "Check for Cloud SQL Database Instances with Public IPs",
+  "CheckType": [],
+  "ServiceName": "cloudsql",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "DatabaseInstance",
+  "Description": "Check for Cloud SQL Database Instances with Public IPs",
+  "Risk": "To lower the organization's attack surface, Cloud SQL databases should not have public IPs. Private IPs provide improved network security and lower latency for your application.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "https://docs.bridgecrew.io/docs/bc_gcp_sql_11#cli-command",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/sql-database-instances-with-public-ips.html",
+      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_11#terraform"
+    },
+    "Recommendation": {
+      "Text": "To lower the organization's attack surface, Cloud SQL databases should not have public IPs. Private IPs provide improved network security and lower latency for your application.",
+      "Url": "https://cloud.google.com/sql/docs/mysql/configure-private-ip"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_ip/cloudsql_instance_public_ip.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_ip/cloudsql_instance_public_ip.py
new file mode 100644
index 00000000..236e32f6
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_public_ip/cloudsql_instance_public_ip.py
@@ -0,0 +1,25 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client
+
+
+class cloudsql_instance_public_ip(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for instance in cloudsql_client.instances:
+            report = Check_Report_GCP(self.metadata())
+            report.project_id = cloudsql_client.project_id
+            report.resource_id = instance.name
+            report.resource_name = instance.name
+            report.location = instance.region
+            report.status = "PASS"
+            report.status_extended = (
+                f"Database Instance {instance.name} does not have a public IP"
+            )
+            if instance.public_ip:
+                report.status = "FAIL"
+                report.status_extended = (
+                    f"Database Instance {instance.name} has a public IP"
+                )
+            findings.append(report)
+
+        return findings
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_contained_database_authentication_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_contained_database_authentication_flag/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_contained_database_authentication_flag/cloudsql_instance_sqlserver_contained_database_authentication_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_contained_database_authentication_flag/cloudsql_instance_sqlserver_contained_database_authentication_flag.metadata.json
new file mode 100644
index 00000000..3071cf84
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_contained_database_authentication_flag/cloudsql_instance_sqlserver_contained_database_authentication_flag.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudsql_instance_sqlserver_contained_database_authentication_flag",
+  "CheckTitle": "Ensure that the 'contained database authentication' database flag for Cloud SQL on the SQL Server instance is set to 'off'",
+  "CheckType": [],
+  "ServiceName": "cloudsql",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "DatabaseInstance",
+  "Description": "Ensure that the 'contained database authentication' database flag for Cloud SQL on the SQL Server instance is set to 'off'",
+  "Risk": "A contained database includes all database settings and metadata required to define the database and has no configuration dependencies on the instance of the Database Engine where the database is installed. Users can connect to the database without authenticating a login at the Database Engine level. Isolating the database from the Database Engine makes it possible to easily move the database to another instance of SQL Server. Contained databases have some unique threats that should be understood and mitigated by SQL Server Database Engine administrators. Most of the threats are related to the USER WITH PASSWORD authentication process, which moves the authentication boundary from the Database Engine level to the database level; hence it is recommended to disable this flag. This recommendation is applicable to SQL Server database instances.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags \"contained database authentication=off\"",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-contained-database-authentication-flag.html",
+      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_sql_10#terraform"
+    },
+    "Recommendation": {
+      "Text": "It is recommended to set the 'contained database authentication' database flag for Cloud SQL SQL Server instances to off.",
+      "Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_contained_database_authentication_flag/cloudsql_instance_sqlserver_contained_database_authentication_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_contained_database_authentication_flag/cloudsql_instance_sqlserver_contained_database_authentication_flag.py
new file mode 100644
index 00000000..11f0b3c2
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_contained_database_authentication_flag/cloudsql_instance_sqlserver_contained_database_authentication_flag.py
@@ -0,0 +1,27 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client
+
+
+class cloudsql_instance_sqlserver_contained_database_authentication_flag(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for instance in cloudsql_client.instances:
+            if "SQLSERVER" in instance.version:
+                report = Check_Report_GCP(self.metadata())
+                report.project_id = cloudsql_client.project_id
+                report.resource_id = instance.name
+                report.resource_name = instance.name
+                report.location = instance.region
+                report.status = "PASS"
+                report.status_extended = f"SQL Server Instance {instance.name} has 'contained database authentication' flag set to 'off'"
+                for flag in instance.flags:
+                    if (
+                        flag["name"] == "contained database authentication"
+                        and flag["value"] == "on"
+                    ):
+                        report.status = "FAIL"
+                        report.status_extended = f"SQL Server Instance {instance.name} has 'contained database authentication' flag set to 'on'"
+                        break
+                findings.append(report)
+
+        return findings
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag/__init__.py
new file mode 100644
index 00000000..e69de29b
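The SQL Server flag checks in this patch invert the PostgreSQL pattern: the finding defaults to PASS and flips to FAIL when a discouraged flag is present and `on`. Note that these flag names contain literal spaces, matching the Admin API and gcloud's database-flags syntax. A standalone sketch under those assumptions (sample data is hypothetical):

```python
# Standalone sketch: SQL Server checks fail on an explicit "on"; an absent flag
# keeps the default PASS. Sample data is hypothetical.

def flag_enabled(flags: list, name: str) -> bool:
    """True if the named database flag is explicitly set to 'on'."""
    return any(f["name"] == name and f["value"] == "on" for f in flags)

flags = [{"name": "contained database authentication", "value": "on"}]
assert flag_enabled(flags, "contained database authentication")   # -> FAIL
assert not flag_enabled([], "contained database authentication")  # absent -> PASS
```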
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag.metadata.json
new file mode 100644
index 00000000..c597df28
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag",
+  "CheckTitle": "Ensure that the 'cross db ownership chaining' database flag for Cloud SQL SQL Server instance is set to 'off'",
+  "CheckType": [],
+  "ServiceName": "cloudsql",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "DatabaseInstance",
+  "Description": "Ensure that the 'cross db ownership chaining' database flag for Cloud SQL SQL Server instance is set to 'off'",
+  "Risk": "Use the cross db ownership chaining option to configure cross-database ownership chaining for an instance of Microsoft SQL Server. This server option allows you to control cross-database ownership chaining at the database level or to allow cross-database ownership chaining for all databases. Enabling cross db ownership chaining is not recommended unless all of the databases hosted by the instance of SQL Server must participate in cross-database ownership chaining and you are aware of the security implications of this setting.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags \"cross db ownership chaining=off\"",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-cross-db-ownership-chaining-flag.html",
+      "Terraform": ""
+    },
+    "Recommendation": {
+      "Text": "It is recommended to set the 'cross db ownership chaining' database flag for Cloud SQL SQL Server instances to off.",
+      "Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag.py
new file mode 100644
index 00000000..3e117500
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag/cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag.py
@@ -0,0 +1,24 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client
+
+
+class cloudsql_instance_sqlserver_cross_db_ownership_chaining_flag(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for instance in cloudsql_client.instances:
+            if "SQLSERVER" in instance.version:
+                report = Check_Report_GCP(self.metadata())
+                report.project_id = cloudsql_client.project_id
+                report.resource_id = instance.name
+                report.resource_name = instance.name
+                report.location = instance.region
+                report.status = "PASS"
+                report.status_extended = f"SQL Server Instance {instance.name} has 'cross db ownership chaining' flag set to 'off'"
+                for flag in instance.flags:
+                    if flag["name"] == "cross db ownership chaining" and flag["value"] == "on":
+                        report.status = "FAIL"
+                        report.status_extended = f"SQL Server Instance {instance.name} does not have 'cross db ownership chaining' flag set to 'off'"
+                        break
+                findings.append(report)
+
+        return findings
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_external_scripts_enabled_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_external_scripts_enabled_flag/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_external_scripts_enabled_flag/cloudsql_instance_sqlserver_external_scripts_enabled_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_external_scripts_enabled_flag/cloudsql_instance_sqlserver_external_scripts_enabled_flag.metadata.json
new file mode 100644
index 00000000..e2bfea28
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_external_scripts_enabled_flag/cloudsql_instance_sqlserver_external_scripts_enabled_flag.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudsql_instance_sqlserver_external_scripts_enabled_flag",
+  "CheckTitle": "Ensure 'external scripts enabled' database flag for Cloud SQL SQL Server instance is set to 'off'",
+  "CheckType": [],
+  "ServiceName": "cloudsql",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "high",
+  "ResourceType": "DatabaseInstance",
+  "Description": "Ensure 'external scripts enabled' database flag for Cloud SQL SQL Server instance is set to 'off'",
+  "Risk": "The 'external scripts enabled' flag enables the execution of scripts with certain remote language extensions. This property is OFF by default. When Advanced Analytics Services is installed, setup can optionally set this property to true. Because the feature allows scripts external to SQL, such as files located in an R library, to be executed, it can adversely affect the security of the system; hence it should be disabled.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags \"external scripts enabled=off\"",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-external-scripts-enabled-flag.html",
+      "Terraform": ""
+    },
+    "Recommendation": {
+      "Text": "It is recommended to set the 'external scripts enabled' database flag for Cloud SQL SQL Server instances to off.",
+      "Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_external_scripts_enabled_flag/cloudsql_instance_sqlserver_external_scripts_enabled_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_external_scripts_enabled_flag/cloudsql_instance_sqlserver_external_scripts_enabled_flag.py
new file mode 100644
index 00000000..c3a4735b
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_external_scripts_enabled_flag/cloudsql_instance_sqlserver_external_scripts_enabled_flag.py
@@ -0,0 +1,27 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client
+
+
+class cloudsql_instance_sqlserver_external_scripts_enabled_flag(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for instance in cloudsql_client.instances:
+            if "SQLSERVER" in instance.version:
+                report = Check_Report_GCP(self.metadata())
+                report.project_id = cloudsql_client.project_id
+                report.resource_id = instance.name
+                report.resource_name = instance.name
+                report.location = instance.region
+                report.status = "PASS"
+                report.status_extended = f"SQL Server Instance {instance.name} has 'external scripts enabled' flag set to 'off'"
+                for flag in instance.flags:
+                    if (
+                        flag["name"] == "external scripts enabled"
+                        and flag["value"] == "on"
+                    ):
+                        report.status = "FAIL"
+                        report.status_extended = f"SQL Server Instance {instance.name} does not have 'external scripts enabled' flag set to 'off'"
+                        break
+                findings.append(report)
+
+        return findings
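Every engine-specific check above first gates on the instance's `databaseVersion` string (for example `POSTGRES_14` or `SQLSERVER_2019_STANDARD`), so a PostgreSQL flag is never evaluated against a SQL Server instance. A standalone sketch of that dispatch; the version strings are illustrative examples following the Admin API's naming scheme, not values taken from the patch:

```python
# Standalone sketch: engine dispatch via substring match on databaseVersion,
# the same gating used by the checks above. Sample versions are hypothetical.

def engine_of(database_version: str) -> str:
    if "POSTGRES" in database_version:
        return "postgres"
    if "SQLSERVER" in database_version:
        return "sqlserver"
    if "MYSQL" in database_version:
        return "mysql"
    return "unknown"

assert engine_of("SQLSERVER_2019_STANDARD") == "sqlserver"
assert engine_of("POSTGRES_14") == "postgres"
```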
"DatabaseInstance", + "Description": "Ensure 'remote access' database flag for Cloud SQL SQL Server instance is set to 'off'", + "Risk": "The remote access option controls the execution of stored procedures from local or remote servers on which instances of SQL Server are running. This default value for this option is 1. This grants permission to run local stored procedures from remote servers or remote stored procedures from the local server. To prevent local stored procedures from being run from a remote server or remote stored procedures from being run on the local server, this must be disabled. The Remote Access option controls the execution of local stored procedures on remote servers or remote stored procedures on local server. 'Remote access' functionality can be abused to launch a Denial-of- Service (DoS) attack on remote servers by off-loading query processing to a target, hence this should be disabled. This recommendation is applicable to SQL Server database instances.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-remote-access-flag.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "It is recommended to set remote access database flag for Cloud SQL SQL Server instance to off.", + "Url": "https://cloud.google.com/sql/docs/sqlserver/flags" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_remote_access_flag/cloudsql_instance_sqlserver_remote_access_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_remote_access_flag/cloudsql_instance_sqlserver_remote_access_flag.py new file mode 100644 index 00000000..bbe7e0a0 --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_remote_access_flag/cloudsql_instance_sqlserver_remote_access_flag.py @@ -0,0 +1,24 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_sqlserver_remote_access_flag(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for instance in cloudsql_client.instances: + if "SQLSERVER" in instance.version: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "PASS" + report.status_extended = f"SQL Server Instance {instance.name} has not 'remote access' flag set to 'on'" + for flag in instance.flags: + if flag["name"] == "remote access" and flag["value"] == "on": + report.status = "FAIL" + report.status_extended = f"SQL Server Instance {instance.name} has 'remote access' flag set to 'on'" + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_trace_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_trace_flag/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_trace_flag/cloudsql_instance_sqlserver_trace_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_trace_flag/cloudsql_instance_sqlserver_trace_flag.metadata.json new file mode 100644 index 00000000..47297dae --- /dev/null 
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_trace_flag/cloudsql_instance_sqlserver_trace_flag.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "cloudsql_instance_sqlserver_trace_flag", + "CheckTitle": "Ensure '3625 (trace flag)' database flag for all Cloud SQL Server instances is set to 'on' ", + "CheckType": [], + "ServiceName": "cloudsql", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "DatabaseInstance", + "Description": "Ensure '3625 (trace flag)' database flag for all Cloud SQL Server instances is set to 'on' ", + "Risk": "Microsoft SQL Trace Flags are frequently used to diagnose performance issues or to debug stored procedures or complex computer systems, but they may also be recommended by Microsoft Support to address behavior that is negatively impacting a specific workload. All documented trace flags and those recommended by Microsoft Support are fully supported in a production environment when used as directed. 3625(trace log) Limits the amount of information returned to users who are not members of the sysadmin fixed server role, by masking the parameters of some error messages using '******'. Setting this in a Google Cloud flag for the instance allows for security through obscurity and prevents the disclosure of sensitive information, hence this is recommended to set this flag globally to on to prevent the flag having been left off, or changed by bad actors. This recommendation is applicable to SQL Server database instances.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "gcloud sql instances patch --database-flags 3625=on", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/disable-3625-trace-flag.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "It is recommended to set 3625 (trace flag) database flag for Cloud SQL SQL Server instance to on.", + "Url": "https://cloud.google.com/sql/docs/sqlserver/flags" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_trace_flag/cloudsql_instance_sqlserver_trace_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_trace_flag/cloudsql_instance_sqlserver_trace_flag.py new file mode 100644 index 00000000..c89ab23a --- /dev/null +++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_trace_flag/cloudsql_instance_sqlserver_trace_flag.py @@ -0,0 +1,24 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client + + +class cloudsql_instance_sqlserver_trace_flag(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for instance in cloudsql_client.instances: + if "SQLSERVER" in instance.version: + report = Check_Report_GCP(self.metadata()) + report.project_id = cloudsql_client.project_id + report.resource_id = instance.name + report.resource_name = instance.name + report.location = instance.region + report.status = "PASS" + report.status_extended = f"SQL Server Instance {instance.name} has '3625 (trace flag)' flag set to 'on'" + for flag in instance.flags: + if flag["name"] == "3625" and flag["value"] == "off": + report.status = "FAIL" + report.status_extended = f"SQL Server Instance {instance.name} has '3625 (trace flag)' flag set to 'off'" + break + findings.append(report) + + return findings diff --git 
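The trace-flag check above fails only when `3625` is explicitly `off`; an instance with no such flag passes by default. A stricter reading of the recommendation ("set to on") would pass only on an explicit `on`, as in this standalone sketch (sample data is hypothetical; this is commentary, not the patch's behavior):

```python
# Standalone sketch: a stricter variant that requires the trace flag to be
# explicitly "on". Sample data is hypothetical.

def explicitly_on(flags: list, name: str = "3625") -> bool:
    return any(f["name"] == name and f["value"] == "on" for f in flags)

assert explicitly_on([{"name": "3625", "value": "on"}])
assert not explicitly_on([])  # absent flag would FAIL under the stricter reading
```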
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_connections_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_connections_flag/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_connections_flag/cloudsql_instance_sqlserver_user_connections_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_connections_flag/cloudsql_instance_sqlserver_user_connections_flag.metadata.json
new file mode 100644
index 00000000..9a730126
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_connections_flag/cloudsql_instance_sqlserver_user_connections_flag.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudsql_instance_sqlserver_user_connections_flag",
+  "CheckTitle": "Ensure 'user connections' Database Flag for Cloud SQL SQL Server Instance Is Set to a Non-limiting Value",
+  "CheckType": [],
+  "ServiceName": "cloudsql",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "DatabaseInstance",
+  "Description": "Ensure 'user connections' Database Flag for Cloud SQL SQL Server Instance Is Set to a Non-limiting Value",
+  "Risk": "The user connections option specifies the maximum number of simultaneous user connections that are allowed on an instance of SQL Server. The actual number of user connections allowed also depends on the version of SQL Server that you are using, and the limits of your application or applications and hardware. SQL Server allows a maximum of 32,767 user connections. User connections is by default a self-configuring value, with SQL Server adjusting the maximum number of user connections automatically as needed, up to the maximum value allowable. For example, if only 10 users are logged in, 10 user connection objects are allocated. In most cases, you do not have to change the value for this option. The default is 0, which means that the maximum (32,767) user connections are allowed. However, if a number is defined here that limits connections, SQL Server will not allow any more connections above this limit. If the connections are at the limit, any new requests will be dropped, potentially causing lost data or outages for those using the database.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "gcloud sql instances patch INSTANCE_NAME --database-flags \"user connections=0\"",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/configure-user-connection-flag.html",
+      "Terraform": ""
+    },
+    "Recommendation": {
+      "Text": "It is recommended to check the user connections for a Cloud SQL SQL Server instance to ensure that it is not artificially limiting connections.",
+      "Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_connections_flag/cloudsql_instance_sqlserver_user_connections_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_connections_flag/cloudsql_instance_sqlserver_user_connections_flag.py
new file mode 100644
index 00000000..ef892b91
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_connections_flag/cloudsql_instance_sqlserver_user_connections_flag.py
@@ -0,0 +1,24 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client
+
+
+class cloudsql_instance_sqlserver_user_connections_flag(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for instance in cloudsql_client.instances:
+            if "SQLSERVER" in instance.version:
+                report = Check_Report_GCP(self.metadata())
+                report.project_id = cloudsql_client.project_id
+                report.resource_id = instance.name
+                report.resource_name = instance.name
+                report.location = instance.region
+                report.status = "PASS"
+                report.status_extended = f"SQL Server Instance {instance.name} has 'user connections' flag set to '0'"
+                for flag in instance.flags:
+                    if flag["name"] == "user connections" and flag["value"] != "0":
+                        report.status = "FAIL"
+                        report.status_extended = f"SQL Server Instance {instance.name} does not have 'user connections' flag set to '0'"
+                        break
+                findings.append(report)
+
+        return findings
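Per the risk text, 0 means "non-limiting" for SQL Server's user connections option (the engine then self-configures up to 32,767), so an instance should fail only when the flag is present with a non-zero value. A standalone sketch of that predicate (sample data is hypothetical):

```python
# Standalone sketch: 0 (or an unset flag) is the non-limiting default;
# any other explicit value artificially caps connections. Sample data is hypothetical.

def artificially_limited(flags: list) -> bool:
    return any(
        f["name"] == "user connections" and f["value"] != "0" for f in flags
    )

assert artificially_limited([{"name": "user connections", "value": "500"}])    # FAIL
assert not artificially_limited([{"name": "user connections", "value": "0"}])  # PASS
assert not artificially_limited([])  # unset defaults to 0 -> PASS
```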
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_options_flag/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_options_flag/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_options_flag/cloudsql_instance_sqlserver_user_options_flag.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_options_flag/cloudsql_instance_sqlserver_user_options_flag.metadata.json
new file mode 100644
index 00000000..24737f76
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_options_flag/cloudsql_instance_sqlserver_user_options_flag.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudsql_instance_sqlserver_user_options_flag",
+  "CheckTitle": "Ensure 'user options' database flag for Cloud SQL SQL Server instance is not configured",
+  "CheckType": [],
+  "ServiceName": "cloudsql",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "DatabaseInstance",
+  "Description": "Ensure 'user options' database flag for Cloud SQL SQL Server instance is not configured",
+  "Risk": "The user options option specifies global defaults for all users. A list of default query processing options is established for the duration of a user's work session. The user options option allows you to change the default values of the SET options (if the server's default settings are not appropriate). A user can override these defaults by using the SET statement. You can configure user options dynamically for new logins. After you change the setting of user options, new login sessions use the new setting; current login sessions are not affected.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/user-options-flag-not-configured.html",
+      "Terraform": ""
+    },
+    "Recommendation": {
+      "Text": "It is recommended that the 'user options' database flag for Cloud SQL SQL Server instances not be configured.",
+      "Url": "https://cloud.google.com/sql/docs/sqlserver/flags"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_options_flag/cloudsql_instance_sqlserver_user_options_flag.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_options_flag/cloudsql_instance_sqlserver_user_options_flag.py
new file mode 100644
index 00000000..26466603
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_sqlserver_user_options_flag/cloudsql_instance_sqlserver_user_options_flag.py
@@ -0,0 +1,24 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client
+
+
+class cloudsql_instance_sqlserver_user_options_flag(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for instance in cloudsql_client.instances:
+            if "SQLSERVER" in instance.version:
+                report = Check_Report_GCP(self.metadata())
+                report.project_id = cloudsql_client.project_id
+                report.resource_id = instance.name
+                report.resource_name = instance.name
+                report.location = instance.region
+                report.status = "PASS"
+                report.status_extended = f"SQL Server Instance {instance.name} does not have 'user options' flag set"
+                for flag in instance.flags:
+                    if flag["name"] == "user options" and flag["value"] != "":
+                        report.status = "FAIL"
+                        report.status_extended = f"SQL Server Instance {instance.name} has 'user options' flag set"
+                        break
+                findings.append(report)
+
+        return findings
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_ssl_connections/__init__.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_ssl_connections/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_ssl_connections/cloudsql_instance_ssl_connections.metadata.json b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_ssl_connections/cloudsql_instance_ssl_connections.metadata.json
new file mode 100644
index 00000000..c4c43724
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_ssl_connections/cloudsql_instance_ssl_connections.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudsql_instance_ssl_connections",
+  "CheckTitle": "Ensure That the Cloud SQL Database Instance Requires All Incoming Connections To Use SSL",
+  "CheckType": [],
+  "ServiceName": "cloudsql",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "DatabaseInstance",
+  "Description": "Ensure That the Cloud SQL Database Instance Requires All Incoming Connections To Use SSL",
+  "Risk": "SQL database connections, if successfully trapped (MITM), can reveal sensitive data such as credentials, database queries, and query outputs. For security, it is recommended to always use SSL encryption when connecting to your instance. This recommendation is applicable for PostgreSQL, MySQL generation 1, MySQL generation 2 and SQL Server 2017 instances.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "gcloud sql instances patch INSTANCE_NAME --require-ssl",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudSQL/enable-ssl-for-incoming-connections.html",
+      "Terraform": ""
+    },
+    "Recommendation": {
+      "Text": "It is recommended to enforce SSL on all incoming connections to the SQL database instance.",
+      "Url": "https://cloud.google.com/sql/docs/postgres/configure-ssl-instance/"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_instance_ssl_connections/cloudsql_instance_ssl_connections.py b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_ssl_connections/cloudsql_instance_ssl_connections.py
new file mode 100644
index 00000000..8a798a8d
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_instance_ssl_connections/cloudsql_instance_ssl_connections.py
@@ -0,0 +1,23 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudsql.cloudsql_client import cloudsql_client
+
+
+class cloudsql_instance_ssl_connections(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for instance in cloudsql_client.instances:
+            report = Check_Report_GCP(self.metadata())
+            report.project_id = cloudsql_client.project_id
+            report.resource_id = instance.name
+            report.resource_name = instance.name
+            report.location = instance.region
+            report.status = "PASS"
+            report.status_extended = (
+                f"Database Instance {instance.name} requires SSL connections"
+            )
+            if not instance.ssl:
+                report.status = "FAIL"
+                report.status_extended = f"Database Instance {instance.name} does not require SSL connections"
+            findings.append(report)
+
+        return findings
diff --git a/prowler/providers/gcp/services/cloudsql/cloudsql_service.py b/prowler/providers/gcp/services/cloudsql/cloudsql_service.py
new file mode 100644
index 00000000..947cac8e
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudsql/cloudsql_service.py
@@ -0,0 +1,66 @@
+from pydantic import BaseModel
+
+from prowler.lib.logger import logger
+from prowler.providers.gcp.gcp_provider import generate_client
+
+
+################## CloudSQL
+class CloudSQL:
+    def __init__(self, audit_info):
+        self.service = "sqladmin"
+        self.api_version = "v1"
+        self.project_id = audit_info.project_id
+        self.client = generate_client(self.service, self.api_version, audit_info)
+        self.instances = []
+        self.__get_instances__()
+
+    def __get_instances__(self):
+        try:
+            request = self.client.instances().list(project=self.project_id)
+            while request is not None:
+                response = request.execute()
+
+                for instance in response.get("items", []):
+                    public_ip = False
+                    for address in instance.get("ipAddresses", []):
+                        if address["type"] == "PRIMARY":
+                            public_ip = True
+                    self.instances.append(
+                        Instance(
+                            name=instance["name"],
+                            version=instance["databaseVersion"],
+                            region=instance["region"],
+                            ip_addresses=instance.get("ipAddresses", []),
+                            public_ip=public_ip,
+                            ssl=instance["settings"]["ipConfiguration"].get(
+                                "requireSsl", False
+                            ),
+                            automated_backups=instance["settings"][
+                                "backupConfiguration"
+                            ]["enabled"],
+                            authorized_networks=instance["settings"]["ipConfiguration"].get(
+                                "authorizedNetworks", []
+                            ),
+                            flags=instance["settings"].get("databaseFlags", []),
+                        )
+                    )
+
+                request = self.client.instances().list_next(
+                    previous_request=request, previous_response=response
+                )
+        except Exception as error:
+            logger.error(
+                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
+            )
+
+
+class Instance(BaseModel):
+    name: str
+    version: str
+    ip_addresses: list
+    region: str
+    public_ip: bool
+    authorized_networks: list
+    ssl: bool
+    automated_backups: bool
+    flags: list
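The service class above uses googleapiclient's standard pagination idiom: `list_next()` returns the request for the next page, or `None` when the collection is exhausted. A standalone sketch of that loop as a reusable generator; `collection` stands for any discovery-built resource collection (`instances()`, `buckets()`, ...), and the helper name is illustrative, not part of the patch:

```python
# Standalone sketch of the list/list_next pagination pattern used throughout
# these GCP service classes. `collection` is any discovery API collection object.

def iterate_collection(collection, **list_kwargs):
    """Yield every item across all pages of a discovery API collection."""
    request = collection.list(**list_kwargs)
    while request is not None:
        response = request.execute()
        yield from response.get("items", [])
        request = collection.list_next(
            previous_request=request, previous_response=response
        )
```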
diff --git a/prowler/providers/gcp/services/cloudstorage/__init__.py b/prowler/providers/gcp/services/cloudstorage/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_public_access/__init__.py b/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_public_access/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_public_access/cloudstorage_bucket_public_access.metadata.json b/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_public_access/cloudstorage_bucket_public_access.metadata.json
new file mode 100644
index 00000000..af50d17e
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_public_access/cloudstorage_bucket_public_access.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudstorage_bucket_public_access",
+  "CheckTitle": "Ensure That Cloud Storage Bucket Is Not Anonymously or Publicly Accessible",
+  "CheckType": [],
+  "ServiceName": "cloudstorage",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "high",
+  "ResourceType": "Bucket",
+  "Description": "Ensure That Cloud Storage Bucket Is Not Anonymously or Publicly Accessible",
+  "Risk": "Allowing anonymous or public access grants permissions to anyone to access bucket content. Such access might not be desired if you are storing any sensitive data. Hence, ensure that anonymous or public access to a bucket is not allowed.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "https://docs.bridgecrew.io/docs/bc_gcp_public_1#cli-command",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudStorage/publicly-accessible-storage-buckets.html",
+      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_public_1#terraform"
+    },
+    "Recommendation": {
+      "Text": "It is recommended that the IAM policy on a Cloud Storage bucket does not allow anonymous or public access.",
+      "Url": "https://cloud.google.com/storage/docs/access-control/iam-reference"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_public_access/cloudstorage_bucket_public_access.py b/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_public_access/cloudstorage_bucket_public_access.py
new file mode 100644
index 00000000..b55f99cf
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_public_access/cloudstorage_bucket_public_access.py
@@ -0,0 +1,23 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudstorage.cloudstorage_client import (
+    cloudstorage_client,
+)
+
+
+class cloudstorage_bucket_public_access(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for bucket in cloudstorage_client.buckets:
+            report = Check_Report_GCP(self.metadata())
+            report.project_id = cloudstorage_client.project_id
+            report.resource_id = bucket.id
+            report.resource_name = bucket.name
+            report.location = bucket.region
+            report.status = "PASS"
+            report.status_extended = f"Bucket {bucket.name} is not publicly accessible"
+            if bucket.public:
+                report.status = "FAIL"
+                report.status_extended = f"Bucket {bucket.name} is publicly accessible"
+            findings.append(report)
+
+        return findings
diff --git a/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_uniform_bucket_level_access/__init__.py b/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_uniform_bucket_level_access/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_uniform_bucket_level_access/cloudstorage_bucket_uniform_bucket_level_access.metadata.json b/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_uniform_bucket_level_access/cloudstorage_bucket_uniform_bucket_level_access.metadata.json
new file mode 100644
index 00000000..22a2d42f
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_uniform_bucket_level_access/cloudstorage_bucket_uniform_bucket_level_access.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "cloudstorage_bucket_uniform_bucket_level_access",
+  "CheckTitle": "Ensure That Cloud Storage Buckets Have Uniform Bucket-Level Access Enabled",
+  "CheckType": [],
+  "ServiceName": "cloudstorage",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "Bucket",
+  "Description": "Ensure That Cloud Storage Buckets Have Uniform Bucket-Level Access Enabled",
+  "Risk": "Enabling uniform bucket-level access guarantees that if a Storage bucket is not publicly accessible, no object in the bucket is publicly accessible either.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "gsutil uniformbucketlevelaccess set on gs://BUCKET_NAME/",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudStorage/enable-uniform-bucket-level-access.html",
+      "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_gcs_2#terraform"
+    },
+    "Recommendation": {
+      "Text": "It is recommended that uniform bucket-level access is enabled on Cloud Storage buckets.",
+      "Url": "https://cloud.google.com/storage/docs/using-uniform-bucket-level-access"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_uniform_bucket_level_access/cloudstorage_bucket_uniform_bucket_level_access.py b/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_uniform_bucket_level_access/cloudstorage_bucket_uniform_bucket_level_access.py
new file mode 100644
index 00000000..3834ea9d
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudstorage/cloudstorage_bucket_uniform_bucket_level_access/cloudstorage_bucket_uniform_bucket_level_access.py
@@ -0,0 +1,27 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.cloudstorage.cloudstorage_client import (
+    cloudstorage_client,
+)
+
+
+class cloudstorage_bucket_uniform_bucket_level_access(Check):
+    def execute(self) -> Check_Report_GCP:
+        findings = []
+        for bucket in cloudstorage_client.buckets:
+            report = Check_Report_GCP(self.metadata())
+            report.project_id = cloudstorage_client.project_id
+            report.resource_id = bucket.id
+            report.resource_name = bucket.name
+            report.location = bucket.region
+            report.status = "PASS"
+            report.status_extended = (
+                f"Bucket {bucket.name} has uniform Bucket Level Access enabled"
+            )
+            if not bucket.uniform_bucket_level_access:
+                report.status = "FAIL"
+                report.status_extended = (
+                    f"Bucket {bucket.name} has uniform Bucket Level Access disabled"
+                )
+            findings.append(report)
+
+        return findings
diff --git a/prowler/providers/gcp/services/cloudstorage/cloudstorage_client.py b/prowler/providers/gcp/services/cloudstorage/cloudstorage_client.py
new file mode 100644
index 00000000..aca1c82b
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudstorage/cloudstorage_client.py
@@ -0,0 +1,6 @@
+from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info
+from prowler.providers.gcp.services.cloudstorage.cloudstorage_service import (
+    CloudStorage,
+)
+
+cloudstorage_client = CloudStorage(gcp_audit_info)
diff --git a/prowler/providers/gcp/services/cloudstorage/cloudstorage_service.py b/prowler/providers/gcp/services/cloudstorage/cloudstorage_service.py
new file mode 100644
index 00000000..f78bbfd9
--- /dev/null
+++ b/prowler/providers/gcp/services/cloudstorage/cloudstorage_service.py
@@ -0,0 +1,59 @@
+from pydantic import BaseModel
+
+from prowler.lib.logger import logger
+from prowler.providers.gcp.gcp_provider import generate_client
+
+
+################## CloudStorage
+class CloudStorage:
+    def __init__(self, audit_info):
+        self.service = "storage"
+        self.api_version = "v1"
+        self.project_id = audit_info.project_id
+        self.client = generate_client(self.service, self.api_version, audit_info)
+        self.buckets = []
+        self.__get_buckets__()
+
+    def __get_buckets__(self):
+        try:
+            request = self.client.buckets().list(project=self.project_id)
+            while request is not None:
+                response = request.execute()
+                for bucket in response.get("items", []):
+                    bucket_iam = (
+                        self.client.buckets()
+                        .getIamPolicy(bucket=bucket["id"])
+                        .execute().get("bindings", [])
+                    )
+                    public = False
+                    if "allAuthenticatedUsers" in str(bucket_iam) or "allUsers" in str(
+                        bucket_iam
+                    ):
+                        public = True
+                    self.buckets.append(
+                        Bucket(
+                            name=bucket["name"],
+                            id=bucket["id"],
+                            region=bucket["location"],
+                            uniform_bucket_level_access=bucket["iamConfiguration"][
+                                "uniformBucketLevelAccess"
+                            ]["enabled"],
+                            public=public,
+                        )
+                    )
+
+                request = self.client.buckets().list_next(
+                    previous_request=request, previous_response=response
+                )
+        except Exception as error:
+            logger.error(
+                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
+            )
+
+
+class Bucket(BaseModel):
+    name: str
+    id: str
+    region: str
+    uniform_bucket_level_access: bool
+    public: bool
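The storage service above detects public buckets by substring-matching the stringified IAM policy. An equivalent but more precise scan walks the bindings' members explicitly, as in this standalone sketch; the sample policy is hypothetical, shaped like a `buckets().getIamPolicy()` response:

```python
# Standalone sketch: a bucket is public when any IAM binding grants a role to
# allUsers or allAuthenticatedUsers. Sample bindings are hypothetical.

PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}

def is_public(bindings: list) -> bool:
    return any(
        member in PUBLIC_MEMBERS
        for binding in bindings
        for member in binding.get("members", [])
    )

sample_bindings = [
    {"role": "roles/storage.objectViewer", "members": ["allUsers"]},
    {"role": "roles/storage.admin", "members": ["user:admin@example.com"]},
]
assert is_public(sample_bindings)
```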
"gcp", + "CheckID": "compute_instance_public_ip", + "CheckTitle": "Check for Virtual Machine Instances with Public IP Addresses", + "CheckType": [], + "ServiceName": "compute", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "high", + "ResourceType": "VMInstance", + "Description": "Check for Virtual Machine Instances with Public IP Addresses", + "Risk": "To reduce your attack surface, Compute instances should not have public IP addresses. Instead, instances should be configured behind load balancers, to minimize the instance's exposure to the internet.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "https://docs.bridgecrew.io/docs/bc_gcp_public_2#cli-command", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/Compute/instances-with-public-ip-addresses.html", + "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_public_2#terraform" + }, + "Recommendation": { + "Text": "Ensure that your Google Compute Engine instances are not configured to have external IP addresses in order to minimize their exposure to the Internet.", + "Url": "https://cloud.google.com/compute/docs/instances/connecting-to-instance" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/compute/compute_instance_public_ip/compute_instance_public_ip.py b/prowler/providers/gcp/services/compute/compute_instance_public_ip/compute_instance_public_ip.py new file mode 100644 index 00000000..129489dd --- /dev/null +++ b/prowler/providers/gcp/services/compute/compute_instance_public_ip/compute_instance_public_ip.py @@ -0,0 +1,21 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.compute.compute_client import compute_client + + +class compute_instance_public_ip(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for instance in compute_client.instances: + report = Check_Report_GCP(self.metadata()) + report.project_id = compute_client.project_id + report.resource_id = instance.id + report.resource_name = instance.name + report.location = instance.zone + report.status = "PASS" + report.status_extended = f"VM Instance {instance.name} has not a public IP" + if instance.public_ip: + report.status = "FAIL" + report.status_extended = f"VM Instance {instance.name} has a public IP" + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/compute/compute_network_default_in_use/__init__.py b/prowler/providers/gcp/services/compute/compute_network_default_in_use/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/compute/compute_network_default_in_use/compute_network_default_in_use.metadata.json b/prowler/providers/gcp/services/compute/compute_network_default_in_use/compute_network_default_in_use.metadata.json new file mode 100644 index 00000000..544a80b6 --- /dev/null +++ b/prowler/providers/gcp/services/compute/compute_network_default_in_use/compute_network_default_in_use.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "compute_network_default_in_use", + "CheckTitle": "Ensure that the default network does not exist", + "CheckType": [], + "ServiceName": "compute", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "high", + "ResourceType": "Network", + "Description": "Ensure that the default network does not exist", + "Risk": "The default network has a preconfigured network configuration and automatically generates 
insecure firewall rules.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "https://docs.bridgecrew.io/docs/bc_gcp_networking_7#cli-command", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudVPC/default-vpc-in-use.html", + "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_networking_7#terraform" + }, + "Recommendation": { + "Text": "When an organization deletes the default network, it may need to migrate its services onto a new network.", + "Url": "https://cloud.google.com/vpc/docs/using-vpc" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/compute/compute_network_default_in_use/compute_network_default_in_use.py b/prowler/providers/gcp/services/compute/compute_network_default_in_use/compute_network_default_in_use.py new file mode 100644 index 00000000..402790cb --- /dev/null +++ b/prowler/providers/gcp/services/compute/compute_network_default_in_use/compute_network_default_in_use.py @@ -0,0 +1,22 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.compute.compute_client import compute_client + + +class compute_network_default_in_use(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + report = Check_Report_GCP(self.metadata()) + report.project_id = compute_client.project_id + report.resource_id = "default" + report.resource_name = "default" + report.location = "global" + report.status = "PASS" + report.status_extended = "Default network does not exist" + for network in compute_client.networks: + if network.name == "default": + report.status = "FAIL" + report.status_extended = "Default network is in use" + + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/compute/compute_service.py b/prowler/providers/gcp/services/compute/compute_service.py new file mode 100644 index 00000000..b7725576 --- /dev/null +++ b/prowler/providers/gcp/services/compute/compute_service.py @@ -0,0 +1,102 @@ +from pydantic import BaseModel + +from prowler.lib.logger import logger +from prowler.providers.gcp.gcp_provider import generate_client + + +################## Compute +class Compute: + def __init__(self, audit_info): + self.service = "compute" + self.api_version = "v1" + self.project_id = audit_info.project_id + self.client = generate_client(self.service, self.api_version, audit_info) + self.zones = [] + self.instances = [] + self.networks = [] + self.__get_zones__() + self.__get_instances__() + self.__get_networks__() + + def __get_zones__(self): + try: + request = self.client.zones().list(project=self.project_id) + while request is not None: + response = request.execute() + + for zone in response.get("items", []): + self.zones.append(zone["name"]) + + request = self.client.zones().list_next( + previous_request=request, previous_response=response + ) + except Exception as error: + logger.error( + f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}" + ) + + def __get_instances__(self): + try: + for zone in self.zones: + request = self.client.instances().list( + project=self.project_id, zone=zone + ) + while request is not None: + response = request.execute() + + for instance in response.get("items", []): + public_ip = False + for interface in instance["networkInterfaces"]: + for config in interface.get("accessConfigs", []): + if "natIP" in config: + public_ip = True + self.instances.append( + Instance( + name=instance["name"], + id=instance["id"], + 
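# zone comes from the enclosing per-zone listing loop; public_ip reflects any NAT'd access config found above +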
zone=zone, + public_ip=public_ip, + ) + ) + + request = self.client.instances().list_next( + previous_request=request, previous_response=response + ) + except Exception as error: + logger.error( + f"{zone} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}" + ) + + def __get_networks__(self): + try: + request = self.client.networks().list(project=self.project_id) + while request is not None: + response = request.execute() + + for network in response.get("items", []): + self.networks.append( + Network( + name=network["name"], + id=network["id"], + ) + ) + + request = self.client.networks().list_next( + previous_request=request, previous_response=response + ) + except Exception as error: + logger.error( + f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}" + ) + + +class Instance(BaseModel): + name: str + id: str + zone: str + public_ip: bool + + +class Network(BaseModel): + name: str + id: str diff --git a/prowler/providers/gcp/services/iam/__init__.py b/prowler/providers/gcp/services/iam/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/iam/iam_client.py b/prowler/providers/gcp/services/iam/iam_client.py new file mode 100644 index 00000000..0752df0d --- /dev/null +++ b/prowler/providers/gcp/services/iam/iam_client.py @@ -0,0 +1,4 @@ +from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info +from prowler.providers.gcp.services.iam.iam_service import IAM + +iam_client = IAM(gcp_audit_info) diff --git a/prowler/providers/gcp/services/iam/iam_sa_no_administrative_privileges/__init__.py b/prowler/providers/gcp/services/iam/iam_sa_no_administrative_privileges/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/iam/iam_sa_no_administrative_privileges/iam_sa_no_administrative_privileges.metadata.json b/prowler/providers/gcp/services/iam/iam_sa_no_administrative_privileges/iam_sa_no_administrative_privileges.metadata.json new file mode 100644 index 00000000..f8e71cd8 --- /dev/null +++ b/prowler/providers/gcp/services/iam/iam_sa_no_administrative_privileges/iam_sa_no_administrative_privileges.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "iam_sa_no_administrative_privileges", + "CheckTitle": "Ensure Service Account does not have admin privileges", + "CheckType": [], + "ServiceName": "iam", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "high", + "ResourceType": "ServiceAccount", + "Description": "Ensure Service Account does not have admin privileges", + "Risk": "Enrolling ServiceAccount with Admin rights gives full access to an assigned application or a VM. 
A holder of service account access can perform critical actions, such as deleting resources and changing settings, without user intervention.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "https://docs.bridgecrew.io/docs/bc_gcp_iam_4#cli-command", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudIAM/restrict-admin-access-for-service-accounts.html", + "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_iam_4#terraform" + }, + "Recommendation": { + "Text": "Ensure that your Google Cloud user-managed service accounts are not using privileged (administrator) roles, in order to implement the principle of least privilege and prevent any accidental or intentional modifications that may lead to data leaks and/or data loss.", + "Url": "https://cloud.google.com/iam/docs/manage-access-service-accounts" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/iam/iam_sa_no_administrative_privileges/iam_sa_no_administrative_privileges.py b/prowler/providers/gcp/services/iam/iam_sa_no_administrative_privileges/iam_sa_no_administrative_privileges.py new file mode 100644 index 00000000..632688ea --- /dev/null +++ b/prowler/providers/gcp/services/iam/iam_sa_no_administrative_privileges/iam_sa_no_administrative_privileges.py @@ -0,0 +1,31 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.cloudresourcemanager.cloudresourcemanager_client import ( + cloudresourcemanager_client, +) +from prowler.providers.gcp.services.iam.iam_client import iam_client + + +class iam_sa_no_administrative_privileges(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for account in iam_client.service_accounts: + report = Check_Report_GCP(self.metadata()) + report.project_id = iam_client.project_id + report.resource_id = account.email + report.resource_name = account.display_name + report.location = iam_client.region + report.status = "PASS" + report.status_extended = ( + f"Account {account.email} has no administrative privileges" + ) + for binding in cloudresourcemanager_client.bindings: + if f"serviceAccount:{account.email}" in binding.members and ( + "admin" in binding.role.lower() + or "owner" in binding.role.lower() + or "editor" in binding.role.lower() + ): + report.status = "FAIL" + report.status_extended = f"Account {account.email} has administrative privileges with {binding.role}" + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/iam/iam_sa_no_user_managed_keys/__init__.py b/prowler/providers/gcp/services/iam/iam_sa_no_user_managed_keys/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/iam/iam_sa_no_user_managed_keys/iam_sa_no_user_managed_keys.metadata.json b/prowler/providers/gcp/services/iam/iam_sa_no_user_managed_keys/iam_sa_no_user_managed_keys.metadata.json new file mode 100644 index 00000000..14a507b0 --- /dev/null +++ b/prowler/providers/gcp/services/iam/iam_sa_no_user_managed_keys/iam_sa_no_user_managed_keys.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "iam_sa_no_user_managed_keys", + "CheckTitle": "Ensure That There Are Only GCP-Managed Service Account Keys for Each Service Account", + "CheckType": [], + "ServiceName": "iam", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "ServiceAccountKey", + "Description": "Ensure That There Are Only GCP-Managed Service
Account Keys for Each Service Account", + "Risk": "Anyone who has access to the keys will be able to access resources through the service account. GCP-managed keys are used by Cloud Platform services such as App Engine and Compute Engine. These keys cannot be downloaded. Google will keep the keys and automatically rotate them on an approximately weekly basis. User-managed keys are created, downloadable, and managed by users.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudIAM/delete-user-managed-service-account-keys.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "It is recommended to prevent user-managed service account keys.", + "Url": "https://cloud.google.com/iam/docs/creating-managing-service-account-keys" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/iam/iam_sa_no_user_managed_keys/iam_sa_no_user_managed_keys.py b/prowler/providers/gcp/services/iam/iam_sa_no_user_managed_keys/iam_sa_no_user_managed_keys.py new file mode 100644 index 00000000..81bd73c9 --- /dev/null +++ b/prowler/providers/gcp/services/iam/iam_sa_no_user_managed_keys/iam_sa_no_user_managed_keys.py @@ -0,0 +1,26 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.iam.iam_client import iam_client + + +class iam_sa_no_user_managed_keys(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for account in iam_client.service_accounts: + report = Check_Report_GCP(self.metadata()) + report.project_id = iam_client.project_id + report.resource_id = account.email + report.resource_name = account.display_name + report.location = iam_client.region + report.status = "PASS" + report.status_extended = ( + f"Account {account.email} does not have user-managed keys." + ) + for key in account.keys: + if key.type == "USER_MANAGED": + report.status = "FAIL" + report.status_extended = ( + f"Account {account.email} has user-managed keys." 
+ ) + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/iam/iam_sa_user_managed_key_rotate_90_days/__init__.py b/prowler/providers/gcp/services/iam/iam_sa_user_managed_key_rotate_90_days/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/iam/iam_sa_user_managed_key_rotate_90_days/iam_sa_user_managed_key_rotate_90_days.metadata.json b/prowler/providers/gcp/services/iam/iam_sa_user_managed_key_rotate_90_days/iam_sa_user_managed_key_rotate_90_days.metadata.json new file mode 100644 index 00000000..bf0e2e6d --- /dev/null +++ b/prowler/providers/gcp/services/iam/iam_sa_user_managed_key_rotate_90_days/iam_sa_user_managed_key_rotate_90_days.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "iam_sa_user_managed_key_rotate_90_days", + "CheckTitle": "Ensure User-Managed/External Keys for Service Accounts Are Rotated Every 90 Days", + "CheckType": [], + "ServiceName": "iam", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "low", + "ResourceType": "ServiceAccountKey", + "Description": "Ensure User-Managed/External Keys for Service Accounts Are Rotated Every 90 Days", + "Risk": "Service Account keys should be rotated to ensure that data cannot be accessed with an old key that might have been lost, cracked, or stolen.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudIAM/rotate-service-account-user-managed-keys.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "It is recommended that all Service Account keys are regularly rotated.", + "Url": "https://cloud.google.com/iam/docs/creating-managing-service-account-keys" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/iam/iam_sa_user_managed_key_rotate_90_days/iam_sa_user_managed_key_rotate_90_days.py b/prowler/providers/gcp/services/iam/iam_sa_user_managed_key_rotate_90_days/iam_sa_user_managed_key_rotate_90_days.py new file mode 100644 index 00000000..eaed742e --- /dev/null +++ b/prowler/providers/gcp/services/iam/iam_sa_user_managed_key_rotate_90_days/iam_sa_user_managed_key_rotate_90_days.py @@ -0,0 +1,26 @@ +from datetime import datetime + +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.iam.iam_client import iam_client + + +class iam_sa_user_managed_key_rotate_90_days(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for account in iam_client.service_accounts: + for key in account.keys: + if key.type == "USER_MANAGED": + last_rotated = (datetime.now() - key.valid_after).days + report = Check_Report_GCP(self.metadata()) + report.project_id = iam_client.project_id + report.resource_id = key.name + report.resource_name = account.email + report.location = iam_client.region + report.status = "PASS" + report.status_extended = f"User-managed key {key.name} for account {account.email} was rotated within the last 90 days ({last_rotated} days ago)" + if last_rotated > 90: + report.status = "FAIL" + report.status_extended = f"User-managed key {key.name} for account {account.email} has not been rotated in the last 90 days ({last_rotated} days ago)" + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/iam/iam_service.py b/prowler/providers/gcp/services/iam/iam_service.py new file mode 100644 index 00000000..c5d12d37 --- /dev/null +++ 
b/prowler/providers/gcp/services/iam/iam_service.py @@ -0,0 +1,102 @@ +from datetime import datetime + +from pydantic import BaseModel + +from prowler.lib.logger import logger +from prowler.providers.gcp.gcp_provider import generate_client + + +################## IAM +class IAM: + def __init__(self, audit_info): + self.service = "iam" + self.api_version = "v1" + self.project_id = audit_info.project_id + self.region = "global" + self.client = generate_client(self.service, self.api_version, audit_info) + self.service_accounts = [] + self.__get_service_accounts__() + self.__get_service_accounts_keys__() + + def __get_client__(self): + return self.client + + def __get_service_accounts__(self): + try: + request = ( + self.client.projects() + .serviceAccounts() + .list(name="projects/" + self.project_id) + ) + while request is not None: + response = request.execute() + + for account in response["accounts"]: + self.service_accounts.append( + ServiceAccount( + name=account["name"], + email=account["email"], + display_name=account["displayName"], + ) + ) + + request = ( + self.client.projects() + .serviceAccounts() + .list_next(previous_request=request, previous_response=response) + ) + except Exception as error: + logger.error( + f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}" + ) + + def __get_service_accounts_keys__(self): + try: + for sa in self.service_accounts: + request = ( + self.client.projects() + .serviceAccounts() + .keys() + .list( + name="projects/" + + self.project_id + + "/serviceAccounts/" + + sa.email + ) + ) + response = request.execute() + + for key in response["keys"]: + sa.keys.append( + Key( + name=key["name"].split("/")[-1], + origin=key["keyOrigin"], + type=key["keyType"], + valid_after=datetime.strptime( + key["validAfterTime"], "%Y-%m-%dT%H:%M:%SZ" + ), + valid_before=datetime.strptime( + key["validBeforeTime"], "%Y-%m-%dT%H:%M:%SZ" + ), + ) + ) + + except Exception as error: + logger.error( + f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}" + ) + + +class Key(BaseModel): + name: str + origin: str + type: str + valid_after: datetime + valid_before: datetime + + +class ServiceAccount(BaseModel): + name: str + email: str + display_name: str + keys: list[Key] = [] diff --git a/prowler/providers/gcp/services/kms/__init__.py b/prowler/providers/gcp/services/kms/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/kms/kms_client.py b/prowler/providers/gcp/services/kms/kms_client.py new file mode 100644 index 00000000..cd51f344 --- /dev/null +++ b/prowler/providers/gcp/services/kms/kms_client.py @@ -0,0 +1,4 @@ +from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info +from prowler.providers.gcp.services.kms.kms_service import KMS + +kms_client = KMS(gcp_audit_info) diff --git a/prowler/providers/gcp/services/kms/kms_key_not_publicly_accessible/__init__.py b/prowler/providers/gcp/services/kms/kms_key_not_publicly_accessible/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/kms/kms_key_not_publicly_accessible/kms_key_not_publicly_accessible.metadata.json b/prowler/providers/gcp/services/kms/kms_key_not_publicly_accessible/kms_key_not_publicly_accessible.metadata.json new file mode 100644 index 00000000..4527f67c --- /dev/null +++ b/prowler/providers/gcp/services/kms/kms_key_not_publicly_accessible/kms_key_not_publicly_accessible.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", 
+ "CheckID": "kms_key_not_publicly_accessible", + "CheckTitle": "Check for Publicly Accessible Cloud KMS Keys", + "CheckType": [], + "ServiceName": "kms", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "high", + "ResourceType": "CryptoKey", + "Description": "Check for Publicly Accessible Cloud KMS Keys", + "Risk": "Ensure that the IAM policy associated with your Cloud Key Management Service (KMS) keys is restricting anonymous and/or public access", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudKMS/publicly-accessible-kms-cryptokeys.html", + "Terraform": "https://docs.bridgecrew.io/docs/ensure-that-cloud-kms-cryptokeys-are-not-anonymously-or-publicly-accessible#terraform" + }, + "Recommendation": { + "Text": "To deny access from anonymous and public users, remove the bindings for 'allUsers' and 'allAuthenticatedUsers' members from the KMS key's IAM policy.", + "Url": "https://cloud.google.com/kms/docs/iam" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/kms/kms_key_not_publicly_accessible/kms_key_not_publicly_accessible.py b/prowler/providers/gcp/services/kms/kms_key_not_publicly_accessible/kms_key_not_publicly_accessible.py new file mode 100644 index 00000000..32645eb1 --- /dev/null +++ b/prowler/providers/gcp/services/kms/kms_key_not_publicly_accessible/kms_key_not_publicly_accessible.py @@ -0,0 +1,24 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.kms.kms_client import kms_client + + +class kms_key_not_publicly_accessible(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for key in kms_client.crypto_keys: + report = Check_Report_GCP(self.metadata()) + report.project_id = kms_client.project_id + report.resource_id = key.name + report.resource_name = key.name + report.location = key.location + report.status = "PASS" + report.status_extended = f"Key {key.name} is not exposed to Public." + for member in key.members: + if member == "allUsers" or member == "allAuthenticatedUsers": + report.status = "FAIL" + report.status_extended = ( + f"Key {key.name} may be publicly accessible!" 
+ ) + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/kms/kms_key_rotation_enabled/__init__.py b/prowler/providers/gcp/services/kms/kms_key_rotation_enabled/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/kms/kms_key_rotation_enabled/kms_key_rotation_enabled.metadata.json b/prowler/providers/gcp/services/kms/kms_key_rotation_enabled/kms_key_rotation_enabled.metadata.json new file mode 100644 index 00000000..62803f12 --- /dev/null +++ b/prowler/providers/gcp/services/kms/kms_key_rotation_enabled/kms_key_rotation_enabled.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "kms_key_rotation_enabled", + "CheckTitle": "Ensure KMS keys are rotated within a period of 90 days", + "CheckType": [], + "ServiceName": "kms", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "low", + "ResourceType": "CryptoKey", + "Description": "Ensure KMS keys are rotated within a period of 90 days", + "Risk": "Cloud Key Management Service (KMS) keys that are not rotated within a period of 90 days remain usable for longer, increasing the impact of a key compromise and potentially failing security and compliance requirements.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "https://docs.bridgecrew.io/docs/bc_gcp_general_4#cli-command", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudKMS/rotate-kms-encryption-keys.html", + "Terraform": "https://docs.bridgecrew.io/docs/bc_gcp_general_4#terraform" + }, + "Recommendation": { + "Text": "Rotate your KMS keys every 90 days or less. Note that after a successful key rotation, the older key version is still required in order to decrypt the data encrypted by that previous key version.", + "Url": "https://cloud.google.com/kms/docs/key-rotation" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/kms/kms_key_rotation_enabled/kms_key_rotation_enabled.py b/prowler/providers/gcp/services/kms/kms_key_rotation_enabled/kms_key_rotation_enabled.py new file mode 100644 index 00000000..0ecd5c98 --- /dev/null +++ b/prowler/providers/gcp/services/kms/kms_key_rotation_enabled/kms_key_rotation_enabled.py @@ -0,0 +1,28 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.kms.kms_client import kms_client + + +class kms_key_rotation_enabled(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + for key in kms_client.crypto_keys: + report = Check_Report_GCP(self.metadata()) + report.project_id = kms_client.project_id + report.resource_id = key.name + report.resource_name = key.name + report.location = key.location + report.status = "FAIL" + report.status_extended = ( + f"Key {key.name} is not rotated every 90 days or less." + ) + if key.rotation_period: + if ( + int(key.rotation_period[:-1]) // (24 * 3600) <= 90 + ): # Convert seconds to days and check that the rotation period is 90 days or less + report.status = "PASS" + report.status_extended = ( + f"Key {key.name} is rotated every 90 days or less." 
+ ) + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/kms/kms_service.py b/prowler/providers/gcp/services/kms/kms_service.py new file mode 100644 index 00000000..01b49065 --- /dev/null +++ b/prowler/providers/gcp/services/kms/kms_service.py @@ -0,0 +1,142 @@ +from typing import Optional + +from pydantic import BaseModel + +from prowler.lib.logger import logger +from prowler.providers.gcp.gcp_provider import generate_client + + +################## KMS +class KMS: + def __init__(self, audit_info): + self.service = "cloudkms" + self.api_version = "v1" + self.project_id = audit_info.project_id + self.region = "global" + self.client = generate_client(self.service, self.api_version, audit_info) + self.locations = [] + self.key_rings = [] + self.crypto_keys = [] + self.__get_locations__() + self.__get_key_rings__() + self.__get_crypto_keys__() + self.__get_crypto_keys_iam_policy__() + + def __get_client__(self): + return self.client + + def __get_locations__(self): + try: + request = ( + self.client.projects() + .locations() + .list(name="projects/" + self.project_id) + ) + while request is not None: + response = request.execute() + + for location in response["locations"]: + self.locations.append(location["name"]) + + request = ( + self.client.projects() + .locations() + .list_next(previous_request=request, previous_response=response) + ) + except Exception as error: + logger.error( + f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}" + ) + + def __get_key_rings__(self): + try: + for location in self.locations: + request = ( + self.client.projects().locations().keyRings().list(parent=location) + ) + while request is not None: + response = request.execute() + + for ring in response.get("keyRings", []): + self.key_rings.append( + KeyRing( + name=ring["name"], + ) + ) + + request = ( + self.client.projects() + .locations() + .keyRings() + .list_next(previous_request=request, previous_response=response) + ) + except Exception as error: + logger.error( + f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}" + ) + + def __get_crypto_keys__(self): + try: + for ring in self.key_rings: + request = ( + self.client.projects() + .locations() + .keyRings() + .cryptoKeys() + .list(parent=ring.name) + ) + while request is not None: + response = request.execute() + + for key in response.get("cryptoKeys", []): + self.crypto_keys.append( + CryptoKey( + name=key["name"].split("/")[-1], + location=key["name"].split("/")[3], + rotation_period=key.get("rotationPeriod"), + key_ring=ring.name, + ) + ) + + request = ( + self.client.projects() + .locations() + .keyRings() + .cryptoKeys() + .list_next(previous_request=request, previous_response=response) + ) + except Exception as error: + logger.error( + f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}" + ) + + def __get_crypto_keys_iam_policy__(self): + try: + for key in self.crypto_keys: + request = ( + self.client.projects() + .locations() + .keyRings() + .cryptoKeys() + .getIamPolicy(resource=key.key_ring + "/cryptoKeys/" + key.name) + ) + response = request.execute() + + for binding in response.get("bindings", []): + key.members.extend(binding.get("members", [])) + except Exception as error: + logger.error( + f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}" + ) + + +class KeyRing(BaseModel): + name: str + + +class CryptoKey(BaseModel): + name: str + location: str + 
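# Seconds-based duration string as returned by the API (e.g. "7776000s" for 90 days); None when rotation is not configured +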
rotation_period: Optional[str] + key_ring: str + members: list = [] diff --git a/prowler/providers/gcp/services/logging/__init__.py b/prowler/providers/gcp/services/logging/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/logging/logging_client.py b/prowler/providers/gcp/services/logging/logging_client.py new file mode 100644 index 00000000..2eb45eec --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_client.py @@ -0,0 +1,4 @@ +from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info +from prowler.providers.gcp.services.logging.logging_service import Logging + +logging_client = Logging(gcp_audit_info) diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled/__init__.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled.metadata.json b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled.metadata.json new file mode 100644 index 00000000..0a24925a --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled", + "CheckTitle": "Ensure That the Log Metric Filter and Alerts Exist for Audit Configuration Changes.", + "CheckType": [], + "ServiceName": "logging", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "MetricFilter", + "Description": "Ensure That the Log Metric Filter and Alerts Exist for Audit Configuration Changes.", + "Risk": "Admin Activity audit logs and Data Access audit logs produced by the Google Cloud Audit Logs service can be extremely useful for security analysis, resource change tracking, and compliance auditing.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudLogging/enable-audit-configuration-changes-monitoring.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "By using Google Cloud alerting policies to detect audit configuration changes, you make sure that the recommended state of audit configuration is well maintained so that all the activities performed within your GCP project are available for security analysis and auditing at any point in time.", + "Url": "https://cloud.google.com/monitoring/alerts" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled.py new file mode 100644 
index 00000000..a00ba311 --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled.py @@ -0,0 +1,41 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.logging.logging_client import logging_client +from prowler.providers.gcp.services.monitoring.monitoring_client import ( + monitoring_client, +) + + +class logging_log_metric_filter_and_alert_for_audit_configuration_changes_enabled( + Check +): + def execute(self) -> Check_Report_GCP: + findings = [] + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = "" + report.resource_name = "" + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = "There are no log metric filters or alerts associated." + if logging_client.metrics: + for metric in logging_client.metrics: + if ( + 'protoPayload.methodName="SetIamPolicy" AND protoPayload.serviceData.policyDelta.auditConfigDeltas:*' + in metric.filter + ): + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = metric.name + report.resource_name = metric.name + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = f"Log metric filter {metric.name} found but no alerts associated." + for alert_policy in monitoring_client.alert_policies: + for filter in alert_policy.filters: + if metric.name in filter: + report.status = "PASS" + report.status_extended = f"Log metric filter {metric.name} found with alert policy {alert_policy.display_name} associated." + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled/__init__.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled.metadata.json b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled.metadata.json new file mode 100644 index 00000000..77df1a03 --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled", + "CheckTitle": "Ensure That the Log Metric Filter and Alerts Exist for Cloud Storage IAM Permission Changes.", + "CheckType": [], + "ServiceName": "logging", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "MetricFilter", + "Description": "Ensure That the Log Metric Filter and Alerts Exist for Cloud Storage IAM Permission Changes.", + "Risk": "Monitoring changes to cloud storage bucket permissions may reduce the time needed to detect and correct permissions on sensitive cloud storage buckets and objects inside the bucket.", + "RelatedUrl": "", + "Remediation": { + 
"Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudLogging/enable-bucket-permission-changes-monitoring.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "It is recommended that a metric filter and alarm be established for Cloud Storage Bucket IAM changes.", + "Url": "https://cloud.google.com/monitoring/alerts" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled.py new file mode 100644 index 00000000..0f89bdb4 --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled/logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled.py @@ -0,0 +1,39 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.logging.logging_client import logging_client +from prowler.providers.gcp.services.monitoring.monitoring_client import ( + monitoring_client, +) + + +class logging_log_metric_filter_and_alert_for_bucket_permission_changes_enabled(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = "" + report.resource_name = "" + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = "There are no log metric filters or alerts associated." + if logging_client.metrics: + for metric in logging_client.metrics: + if ( + 'resource.type="gcs_bucket" AND protoPayload.methodName="storage.setIamPermissions"' + in metric.filter + ): + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = metric.name + report.resource_name = metric.name + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = f"Log metric filter {metric.name} found but no alerts associated." + for alert_policy in monitoring_client.alert_policies: + for filter in alert_policy.filters: + if metric.name in filter: + report.status = "PASS" + report.status_extended = f"Log metric filter {metric.name} found with alert policy {alert_policy.display_name} associated." 
+ break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled/__init__.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled.metadata.json b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled.metadata.json new file mode 100644 index 00000000..18bfff28 --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "logging_log_metric_filter_and_alert_for_custom_role_changes_enabled", + "CheckTitle": "Ensure That the Log Metric Filter and Alerts Exist for Custom Role Changes.", + "CheckType": [], + "ServiceName": "logging", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "MetricFilter", + "Description": "Ensure That the Log Metric Filter and Alerts Exist for Custom Role Changes.", + "Risk": "Google Cloud IAM provides predefined roles that give granular access to specific Google Cloud Platform resources and prevent unwanted access to other resources.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudLogging/enable-custom-role-changes-monitoring.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "It is recommended that a metric filter and alarm be established for changes to Identity and Access Management (IAM) role creation, deletion and updating activities.", + "Url": "https://cloud.google.com/monitoring/alerts" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled.py new file mode 100644 index 00000000..63445952 --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled/logging_log_metric_filter_and_alert_for_custom_role_changes_enabled.py @@ -0,0 +1,39 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.logging.logging_client import logging_client +from prowler.providers.gcp.services.monitoring.monitoring_client import ( + monitoring_client, +) + + +class logging_log_metric_filter_and_alert_for_custom_role_changes_enabled(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = "" + report.resource_name = "" + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = "There are no log metric filters or alerts associated." 
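+ # Look for a log-based metric whose filter contains the expected custom role changes expression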
+ if logging_client.metrics: + for metric in logging_client.metrics: + if ( + 'resource.type="iam_role" AND (protoPayload.methodName="google.iam.admin.v1.CreateRole" OR protoPayload.methodName="google.iam.admin.v1.DeleteRole" OR protoPayload.methodName="google.iam.admin.v1.UpdateRole")' + in metric.filter + ): + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = metric.name + report.resource_name = metric.name + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = f"Log metric filter {metric.name} found but no alerts associated." + for alert_policy in monitoring_client.alert_policies: + for filter in alert_policy.filters: + if metric.name in filter: + report.status = "PASS" + report.status_extended = f"Log metric filter {metric.name} found with alert policy {alert_policy.display_name} associated." + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled/__init__.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled.metadata.json b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled.metadata.json new file mode 100644 index 00000000..a488b65f --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled", + "CheckTitle": "Ensure Log Metric Filter and Alerts Exist for Project Ownership Assignments/Changes.", + "CheckType": [], + "ServiceName": "logging", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "MetricFilter", + "Description": "Ensure Log Metric Filter and Alerts Exist for Project Ownership Assignments/Changes.", + "Risk": "Project ownership has the highest level of privileges on a GCP project. 
These privileges include viewer permissions on all GCP services inside the project, permission to modify the state of all GCP services within the project, and the ability to set up billing and to manage roles and permissions for the project and all the resources inside the project.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudLogging/enable-ownership-assignments-monitoring.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "Using Google Cloud alerting policies to detect ownership assignments/changes will help you maintain the right access permissions for each IAM member created within your project, follow the security principle of least privilege, and prevent any accidental or intentional changes that may lead to unauthorized actions.", + "Url": "https://cloud.google.com/monitoring/alerts" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled.py new file mode 100644 index 00000000..a85d8865 --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled/logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled.py @@ -0,0 +1,39 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.logging.logging_client import logging_client +from prowler.providers.gcp.services.monitoring.monitoring_client import ( + monitoring_client, +) + + +class logging_log_metric_filter_and_alert_for_project_ownership_changes_enabled(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = "" + report.resource_name = "" + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = "There are no log metric filters or alerts associated." + if logging_client.metrics: + for metric in logging_client.metrics: + if ( + '(protoPayload.serviceName="cloudresourcemanager.googleapis.com") AND (ProjectOwnership OR projectOwnerInvitee) OR (protoPayload.serviceData.policyDelta.bindingDeltas.action="REMOVE" AND protoPayload.serviceData.policyDelta.bindingDeltas.role="roles/owner") OR (protoPayload.serviceData.policyDelta.bindingDeltas.action="ADD" AND protoPayload.serviceData.policyDelta.bindingDeltas.role="roles/owner")' + in metric.filter + ): + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = metric.name + report.resource_name = metric.name + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = f"Log metric filter {metric.name} found but no alerts associated." + for alert_policy in monitoring_client.alert_policies: + for filter in alert_policy.filters: + if metric.name in filter: + report.status = "PASS" + report.status_extended = f"Log metric filter {metric.name} found with alert policy {alert_policy.display_name} associated." 
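+ # Stop at the first alert policy whose filter references this metric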
+ break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled/__init__.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled.metadata.json b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled.metadata.json new file mode 100644 index 00000000..480493fd --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled", + "CheckTitle": "Ensure That the Log Metric Filter and Alerts Exist for SQL Instance Configuration Changes.", + "CheckType": [], + "ServiceName": "logging", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "MetricFilter", + "Description": "Ensure That the Log Metric Filter and Alerts Exist for SQL Instance Configuration Changes.", + "Risk": "Monitoring SQL instance configuration changes may reduce the time needed to detect and correct misconfigurations on the SQL server.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudLogging/enable-network-route-changes-monitoring.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "It is recommended that a metric filter and alarm be established for SQL instance configuration changes.", + "Url": "https://cloud.google.com/monitoring/alerts" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled.py new file mode 100644 index 00000000..c77a0b12 --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled/logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled.py @@ -0,0 +1,41 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.logging.logging_client import logging_client +from prowler.providers.gcp.services.monitoring.monitoring_client import ( + monitoring_client, +) + + +class logging_log_metric_filter_and_alert_for_sql_instance_configuration_changes_enabled( + Check +): + def execute(self) -> Check_Report_GCP: + findings = [] + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + 
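# Project-wide default report; resource fields stay empty unless a matching metric filter is found +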
report.resource_id = "" + report.resource_name = "" + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = "There are no log metric filters or alerts associated." + if logging_client.metrics: + for metric in logging_client.metrics: + if ( + 'protoPayload.methodName="cloudsql.instances.update"' + in metric.filter + ): + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = metric.name + report.resource_name = metric.name + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = f"Log metric filter {metric.name} found but no alerts associated." + for alert_policy in monitoring_client.alert_policies: + for filter in alert_policy.filters: + if metric.name in filter: + report.status = "PASS" + report.status_extended = f"Log metric filter {metric.name} found with alert policy {alert_policy.display_name} associated." + break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled/__init__.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled.metadata.json b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled.metadata.json new file mode 100644 index 00000000..74962c8f --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled", + "CheckTitle": "Ensure That the Log Metric Filter and Alerts Exist for VPC Network Firewall Rule Changes.", + "CheckType": [], + "ServiceName": "logging", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "MetricFilter", + "Description": "Ensure That the Log Metric Filter and Alerts Exist for VPC Network Firewall Rule Changes.", + "Risk": "Monitoring for Create or Update Firewall rule events gives insight to network access changes and may reduce the time it takes to detect suspicious activity.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudLogging/enable-firewall-rule-changes-monitoring.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "It is recommended that a metric filter and alarm be established for Virtual Private Cloud (VPC) Network Firewall rule changes.", + "Url": "https://cloud.google.com/monitoring/alerts" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled.py 
b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled.py new file mode 100644 index 00000000..68e90f0a --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled.py @@ -0,0 +1,39 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.logging.logging_client import logging_client +from prowler.providers.gcp.services.monitoring.monitoring_client import ( + monitoring_client, +) + + +class logging_log_metric_filter_and_alert_for_vpc_firewall_rule_changes_enabled(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = "" + report.resource_name = "" + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = "There are no log metric filters or alerts associated." + if logging_client.metrics: + for metric in logging_client.metrics: + if ( + 'resource.type="gce_firewall_rule" AND (protoPayload.methodName:"compute.firewalls.patch" OR protoPayload.methodName:"compute.firewalls.insert" OR protoPayload.methodName:"compute.firewalls.delete")' + in metric.filter + ): + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = metric.name + report.resource_name = metric.name + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = f"Log metric filter {metric.name} found but no alerts associated." + for alert_policy in monitoring_client.alert_policies: + for filter in alert_policy.filters: + if metric.name in filter: + report.status = "PASS" + report.status_extended = f"Log metric filter {metric.name} found with alert policy {alert_policy.display_name} associated." 
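+ # A single alert policy referencing the metric suffices for a PASS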
+ break + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled/__init__.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled.metadata.json b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled.metadata.json new file mode 100644 index 00000000..e6b1371d --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled.metadata.json @@ -0,0 +1,30 @@ +{ + "Provider": "gcp", + "CheckID": "logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled", + "CheckTitle": "Ensure That the Log Metric Filter and Alerts Exist for VPC Network Changes.", + "CheckType": [], + "ServiceName": "logging", + "SubServiceName": "", + "ResourceIdTemplate": "", + "Severity": "medium", + "ResourceType": "MetricFilter", + "Description": "Ensure That the Log Metric Filter and Alerts Exist for VPC Network Changes.", + "Risk": "Monitoring changes to a VPC will help ensure VPC traffic flow is not getting impacted.", + "RelatedUrl": "", + "Remediation": { + "Code": { + "CLI": "", + "NativeIaC": "", + "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudLogging/enable-vpc-network-changes-monitoring.html", + "Terraform": "" + }, + "Recommendation": { + "Text": "It is recommended that a metric filter and alarm be established for Virtual Private Cloud (VPC) network changes.", + "Url": "https://cloud.google.com/monitoring/alerts" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled.py new file mode 100644 index 00000000..852cf62a --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled.py @@ -0,0 +1,39 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.logging.logging_client import logging_client +from prowler.providers.gcp.services.monitoring.monitoring_client import ( + monitoring_client, +) + + +class logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = "" + report.resource_name = "" + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = "There are no log metric filters or alerts associated." 
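+ # Look for a log-based metric whose filter contains the expected VPC network changes expression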
diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled.py
new file mode 100644
index 00000000..852cf62a
--- /dev/null
+++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled.py
@@ -0,0 +1,39 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.logging.logging_client import logging_client
+from prowler.providers.gcp.services.monitoring.monitoring_client import (
+    monitoring_client,
+)
+
+
+class logging_log_metric_filter_and_alert_for_vpc_network_changes_enabled(Check):
+    def execute(self) -> list[Check_Report_GCP]:
+        findings = []
+        report = Check_Report_GCP(self.metadata())
+        report.project_id = logging_client.project_id
+        report.resource_id = ""
+        report.resource_name = ""
+        report.location = logging_client.region
+        report.status = "FAIL"
+        report.status_extended = "There are no log metric filters or alerts associated."
+        if logging_client.metrics:
+            for metric in logging_client.metrics:
+                if (
+                    'resource.type="gce_network" AND (protoPayload.methodName:"compute.networks.insert" OR protoPayload.methodName:"compute.networks.patch" OR protoPayload.methodName:"compute.networks.delete" OR protoPayload.methodName:"compute.networks.removePeering" OR protoPayload.methodName:"compute.networks.addPeering")'
+                    in metric.filter
+                ):
+                    report = Check_Report_GCP(self.metadata())
+                    report.project_id = logging_client.project_id
+                    report.resource_id = metric.name
+                    report.resource_name = metric.name
+                    report.location = logging_client.region
+                    report.status = "FAIL"
+                    report.status_extended = f"Log metric filter {metric.name} found but no alerts associated."
+                    for alert_policy in monitoring_client.alert_policies:
+                        for policy_filter in alert_policy.filters:
+                            if metric.name in policy_filter:
+                                report.status = "PASS"
+                                report.status_extended = f"Log metric filter {metric.name} found with alert policy {alert_policy.display_name} associated."
+                                break
+        findings.append(report)
+
+        return findings
diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled/__init__.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled.metadata.json b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled.metadata.json
new file mode 100644
index 00000000..928bcb04
--- /dev/null
+++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled",
+  "CheckTitle": "Ensure That the Log Metric Filter and Alerts Exist for VPC Network Route Changes.",
+  "CheckType": [],
+  "ServiceName": "logging",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "MetricFilter",
+  "Description": "Ensure That the Log Metric Filter and Alerts Exist for VPC Network Route Changes.",
+  "Risk": "Monitoring changes to route tables will help ensure that all VPC traffic flows through an expected path.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudLogging/enable-network-route-changes-monitoring.html",
+      "Terraform": ""
+    },
+    "Recommendation": {
+      "Text": "It is recommended that a metric filter and alarm be established for Virtual Private Cloud (VPC) network route changes.",
+      "Url": "https://cloud.google.com/monitoring/alerts"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
diff --git a/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled.py b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled.py
new file mode 100644
index 00000000..0d962fc8
--- /dev/null
+++ b/prowler/providers/gcp/services/logging/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled/logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled.py
@@ -0,0 +1,39 @@
+from prowler.lib.check.models import Check, Check_Report_GCP
+from prowler.providers.gcp.services.logging.logging_client import logging_client
+from prowler.providers.gcp.services.monitoring.monitoring_client import (
+    monitoring_client,
+)
+
+
+class logging_log_metric_filter_and_alert_for_vpc_network_route_changes_enabled(Check):
+    def execute(self) -> list[Check_Report_GCP]:
+        findings = []
+        report = Check_Report_GCP(self.metadata())
+        report.project_id = logging_client.project_id
+        report.resource_id = ""
+        report.resource_name = ""
+        report.location = logging_client.region
+        report.status = "FAIL"
+        report.status_extended = "There are no log metric filters or alerts associated."
+        if logging_client.metrics:
+            for metric in logging_client.metrics:
+                if (
+                    'resource.type="gce_route" AND (protoPayload.methodName:"compute.routes.delete" OR protoPayload.methodName:"compute.routes.insert")'
+                    in metric.filter
+                ):
+                    report = Check_Report_GCP(self.metadata())
+                    report.project_id = logging_client.project_id
+                    report.resource_id = metric.name
+                    report.resource_name = metric.name
+                    report.location = logging_client.region
+                    report.status = "FAIL"
+                    report.status_extended = f"Log metric filter {metric.name} found but no alerts associated."
+                    for alert_policy in monitoring_client.alert_policies:
+                        for policy_filter in alert_policy.filters:
+                            if metric.name in policy_filter:
+                                report.status = "PASS"
+                                report.status_extended = f"Log metric filter {metric.name} found with alert policy {alert_policy.display_name} associated."
+                                break
+        findings.append(report)
+
+        return findings
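
A matching metric alone only gets these checks to the intermediate FAIL state ("found but no alerts associated"); an alert policy whose condition filter references the metric is also required. At the time of writing, such a policy can be created from a JSON definition through the alpha track of `gcloud`, roughly as below; `policy.json` is an illustrative file containing a `conditionThreshold` whose filter references the log-based metric:

```console
gcloud alpha monitoring policies create --policy-from-file=policy.json
```
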
diff --git a/prowler/providers/gcp/services/logging/logging_service.py b/prowler/providers/gcp/services/logging/logging_service.py
new file mode 100644
index 00000000..38903377
--- /dev/null
+++ b/prowler/providers/gcp/services/logging/logging_service.py
@@ -0,0 +1,82 @@
+from pydantic import BaseModel
+
+from prowler.lib.logger import logger
+from prowler.providers.gcp.gcp_provider import generate_client
+
+
+################## Logging
+class Logging:
+    def __init__(self, audit_info):
+        self.service = "logging"
+        self.api_version = "v2"
+        self.region = "global"
+        self.project_id = audit_info.project_id
+        self.client = generate_client(self.service, self.api_version, audit_info)
+        self.sinks = []
+        self.metrics = []
+        self.__get_sinks__()
+        self.__get_metrics__()
+
+    def __get_sinks__(self):
+        try:
+            request = self.client.sinks().list(parent=f"projects/{self.project_id}")
+            while request is not None:
+                response = request.execute()
+
+                for sink in response.get("sinks", []):
+                    self.sinks.append(
+                        Sink(
+                            name=sink["name"],
+                            destination=sink["destination"],
+                            filter=sink.get("filter", "all"),
+                        )
+                    )
+
+                request = self.client.sinks().list_next(
+                    previous_request=request, previous_response=response
+                )
+        except Exception as error:
+            logger.error(
+                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
+            )
+
+    def __get_metrics__(self):
+        try:
+            request = (
+                self.client.projects()
+                .metrics()
+                .list(parent=f"projects/{self.project_id}")
+            )
+            while request is not None:
+                response = request.execute()
+
+                for metric in response.get("metrics", []):
+                    self.metrics.append(
+                        Metric(
+                            name=metric["name"],
+                            type=metric["metricDescriptor"]["type"],
+                            filter=metric["filter"],
+                        )
+                    )
+
+                request = (
+                    self.client.projects()
+                    .metrics()
+                    .list_next(previous_request=request, previous_response=response)
+                )
+        except Exception as error:
+            logger.error(
+                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
+            )
+
+
+class Sink(BaseModel):
+    name: str
+    destination: str
+    filter: str
+
+
+class Metric(BaseModel):
+    name: str
+    type: str
+    filter: str
diff --git a/prowler/providers/gcp/services/logging/logging_sink_created/__init__.py b/prowler/providers/gcp/services/logging/logging_sink_created/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/prowler/providers/gcp/services/logging/logging_sink_created/logging_sink_created.metadata.json b/prowler/providers/gcp/services/logging/logging_sink_created/logging_sink_created.metadata.json
new file mode 100644
index 00000000..e715eb34
--- /dev/null
+++ b/prowler/providers/gcp/services/logging/logging_sink_created/logging_sink_created.metadata.json
@@ -0,0 +1,30 @@
+{
+  "Provider": "gcp",
+  "CheckID": "logging_sink_created",
+  "CheckTitle": "Ensure there is at least one sink used to export copies of all the log entries.",
+  "CheckType": [],
+  "ServiceName": "logging",
+  "SubServiceName": "",
+  "ResourceIdTemplate": "",
+  "Severity": "medium",
+  "ResourceType": "Sink",
+  "Description": "Ensure there is at least one sink used to export copies of all the log entries.",
+  "Risk": "If sinks are not created, logs will be deleted after the configured retention period and will not be backed up.",
+  "RelatedUrl": "",
+  "Remediation": {
+    "Code": {
+      "CLI": "gcloud logging sinks create ",
+      "NativeIaC": "",
+      "Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/gcp/CloudLogging/export-all-log-entries.html",
+      "Terraform": ""
+    },
+    "Recommendation": {
+      "Text": "It is recommended to create a sink that will export copies of all the log entries. This can help aggregate logs from multiple projects and export them to a Security Information and Event Management (SIEM) system.",
+      "Url": "https://cloud.google.com/logging/docs/export"
+    }
+  },
+  "Categories": [],
+  "DependsOn": [],
+  "RelatedTo": [],
+  "Notes": ""
+}
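
A sink created without a log filter exports copies of all log entries; the `Logging` service earlier in this diff records that case as `filter="all"`, which is exactly the condition the `logging_sink_created` check below passes on. An illustrative example with a Cloud Storage destination (both names are placeholders):

```console
gcloud logging sinks create all-entries-sink \
  storage.googleapis.com/example-log-bucket
```
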
"Recommendation": { + "Text": "It is recommended to create a sink that will export copies of all the log entries. This can help aggregate logs from multiple projects and export them to a Security Information and Event Management (SIEM).", + "Url": "https://cloud.google.com/logging/docs/export" + } + }, + "Categories": [], + "DependsOn": [], + "RelatedTo": [], + "Notes": "" +} diff --git a/prowler/providers/gcp/services/logging/logging_sink_created/logging_sink_created.py b/prowler/providers/gcp/services/logging/logging_sink_created/logging_sink_created.py new file mode 100644 index 00000000..2ac70f43 --- /dev/null +++ b/prowler/providers/gcp/services/logging/logging_sink_created/logging_sink_created.py @@ -0,0 +1,32 @@ +from prowler.lib.check.models import Check, Check_Report_GCP +from prowler.providers.gcp.services.logging.logging_client import logging_client + + +class logging_sink_created(Check): + def execute(self) -> Check_Report_GCP: + findings = [] + if not logging_client.sinks: + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = "" + report.resource_name = "" + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = ( + "There are no logging sinks to export copies of all the log entries" + ) + else: + for sink in logging_client.sinks: + report = Check_Report_GCP(self.metadata()) + report.project_id = logging_client.project_id + report.resource_id = sink.name + report.resource_name = sink.name + report.location = logging_client.region + report.status = "FAIL" + report.status_extended = f"Sink {sink.name} is enabled but not exporting copies of all the log entries" + if sink.filter == "all": + report.status = "PASS" + report.status_extended = f"Sink {sink.name} is enabled exporting copies of all the log entries" + findings.append(report) + + return findings diff --git a/prowler/providers/gcp/services/monitoring/__init__.py b/prowler/providers/gcp/services/monitoring/__init__.py new file mode 100644 index 00000000..e69de29b diff --git a/prowler/providers/gcp/services/monitoring/monitoring_client.py b/prowler/providers/gcp/services/monitoring/monitoring_client.py new file mode 100644 index 00000000..e20cd09e --- /dev/null +++ b/prowler/providers/gcp/services/monitoring/monitoring_client.py @@ -0,0 +1,4 @@ +from prowler.providers.gcp.lib.audit_info.audit_info import gcp_audit_info +from prowler.providers.gcp.services.monitoring.monitoring_service import Monitoring + +monitoring_client = Monitoring(gcp_audit_info) diff --git a/prowler/providers/gcp/services/monitoring/monitoring_service.py b/prowler/providers/gcp/services/monitoring/monitoring_service.py new file mode 100644 index 00000000..d2da497a --- /dev/null +++ b/prowler/providers/gcp/services/monitoring/monitoring_service.py @@ -0,0 +1,56 @@ +from pydantic import BaseModel + +from prowler.lib.logger import logger +from prowler.providers.gcp.gcp_provider import generate_client + + +################## Monitoring +class Monitoring: + def __init__(self, audit_info): + self.service = "monitoring" + self.api_version = "v3" + self.region = "global" + self.project_id = audit_info.project_id + self.client = generate_client(self.service, self.api_version, audit_info) + self.alert_policies = [] + self.__get_alert_policies__() + + def __get_alert_policies__(self): + try: + request = ( + self.client.projects() + .alertPolicies() + .list(name=f"projects/{self.project_id}") + ) + while request is not None: + response = request.execute() + + for 
diff --git a/pyproject.toml b/pyproject.toml
index 3f0f597d..1b3f523a 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -37,6 +37,7 @@ boto3 = "1.26.90"
 botocore = "1.29.90"
 colorama = "0.4.6"
 detect-secrets = "1.4.0"
+google-api-python-client = "2.81.0"
 mkdocs = {version = "1.4.2", optional = true}
 mkdocs-material = {version = "9.1.3", optional = true}
 msgraph-core = "0.2.2"
diff --git a/tests/lib/cli/parser_test.py b/tests/lib/cli/parser_test.py
index a2f49bd8..a171852b 100644
--- a/tests/lib/cli/parser_test.py
+++ b/tests/lib/cli/parser_test.py
@@ -92,6 +92,38 @@ class Test_Parser:
         assert not parsed.browser_auth
         assert not parsed.managed_identity_auth
 
+    def test_default_parser_no_arguments_gcp(self):
+        provider = "gcp"
+        command = [prowler_command, provider]
+        parsed = self.parser.parse(command)
+        assert parsed.provider == provider
+        assert not parsed.quiet
+        assert len(parsed.output_modes) == 3
+        assert "csv" in parsed.output_modes
+        assert "html" in parsed.output_modes
+        assert "json" in parsed.output_modes
+        assert not parsed.output_filename
+        assert "output" in parsed.output_directory
+        assert not parsed.verbose
+        assert not parsed.no_banner
+        assert parsed.log_level == "CRITICAL"
+        assert not parsed.log_file
+        assert not parsed.only_logs
+        assert not parsed.checks
+        assert not parsed.checks_file
+        assert not parsed.services
+        assert not parsed.severity
+        assert not parsed.compliance
+        assert len(parsed.categories) == 0
+        assert not parsed.excluded_checks
+        assert not parsed.excluded_services
+        assert not parsed.list_checks
+        assert not parsed.list_services
+        assert not parsed.list_compliance
+        assert not parsed.list_compliance_requirements
+        assert not parsed.list_categories
+        assert not parsed.credentials_file
+
     def test_root_parser_version_short(self):
         command = [prowler_command, "-v"]
         with pytest.raises(SystemExit) as wrapped_exit:
@@ -136,6 +168,12 @@ class Test_Parser:
         print(parsed)
         assert parsed.provider == "azure"
 
+    def test_root_parser_gcp_provider(self):
+        command = [prowler_command, "gcp"]
+        parsed = self.parser.parse(command)
+        print(parsed)
+        assert parsed.provider == "gcp"
+
     def test_root_parser_quiet_short(self):
         command = [prowler_command, "-q"]
         parsed = self.parser.parse(command)
@@ -901,3 +939,11 @@ class Test_Parser:
         _ = self.parser.parse(command)
         assert wrapped_exit.type == SystemExit
         assert wrapped_exit.value.code == 2
+
+    def test_parser_gcp_auth_credentials_file(self):
+        argument = "--credentials-file"
+        file = "test.json"
+        command = [prowler_command, "gcp", argument, file]
+        parsed = self.parser.parse(command)
+        assert parsed.provider == "gcp"
+        assert parsed.credentials_file == file
diff --git a/tests/providers/common/audit_info_test.py b/tests/providers/common/audit_info_test.py
index e787524a..c877cb9e 100644
--- a/tests/providers/common/audit_info_test.py
+++ b/tests/providers/common/audit_info_test.py
@@ -22,6 +22,8 @@ from prowler.providers.common.audit_info import (
     get_tagged_resources,
     set_provider_audit_info,
 )
+from prowler.providers.gcp.gcp_provider import GCP_Provider
+from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info
 
 EXAMPLE_AMI_ID = "ami-12c6146b"
 ACCOUNT_ID = 123456789012
@@ -70,10 +72,14 @@ def mock_set_identity_info(*_):
     return Azure_Identity_Info()
 
 
-def mock_set_credentials(*_):
+def mock_set_azure_credentials(*_):
     return {}
 
 
+def mock_set_gcp_credentials(*_):
+    return (None, None)
+
+
 class Test_Set_Audit_Info:
     @patch(
         "prowler.providers.common.audit_info.current_audit_info",
@@ -168,9 +174,7 @@ class Test_Set_Audit_Info:
         new=mock_current_audit_info,
     )
     @patch.object(Audit_Info, "validate_credentials", new=mock_validate_credentials)
-    @patch.object(
-        Audit_Info, "print_audit_credentials", new=mock_print_audit_credentials
-    )
+    @patch.object(Audit_Info, "print_aws_credentials", new=mock_print_audit_credentials)
     def test_set_audit_info_aws(self):
         provider = "aws"
         arguments = {
@@ -194,7 +198,7 @@ class Test_Set_Audit_Info:
         "prowler.providers.common.audit_info.azure_audit_info",
         new=mock_azure_audit_info,
     )
-    @patch.object(Azure_Provider, "__set_credentials__", new=mock_set_credentials)
+    @patch.object(Azure_Provider, "__set_credentials__", new=mock_set_azure_credentials)
     @patch.object(Azure_Provider, "__set_identity_info__", new=mock_set_identity_info)
     def test_set_audit_info_azure(self):
         provider = "azure"
@@ -216,6 +220,25 @@ class Test_Set_Audit_Info:
         audit_info = set_provider_audit_info(provider, arguments)
         assert isinstance(audit_info, Azure_Audit_Info)
 
+    @patch.object(GCP_Provider, "__set_credentials__", new=mock_set_gcp_credentials)
+    @patch.object(Audit_Info, "print_gcp_credentials", new=mock_print_audit_credentials)
+    def test_set_audit_info_gcp(self):
+        provider = "gcp"
+        arguments = {
+            "profile": None,
+            "role": None,
+            "session_duration": None,
+            "external_id": None,
+            "regions": None,
+            "organizations_role": None,
+            "subscriptions": None,
+            # We need to set exactly one auth method
+            "credentials_file": None,
+        }
+
+        audit_info = set_provider_audit_info(provider, arguments)
+        assert isinstance(audit_info, GCP_Audit_Info)
+
     @mock_resourcegroupstaggingapi
     @mock_ec2
     def test_get_tagged_resources(self):