feat(S3_in_w_x_flags): Support S3 URIs for custom checks paths and whitelist files. #1090

Merged
11 changes: 8 additions & 3 deletions README.md
@@ -292,11 +292,12 @@ Prowler has two parameters related to regions: `-r` that is used query AWS servi

>Note about output formats to use with `-M`: "text" is the default one with colors, "mono" is like default one but monochrome, "csv" is comma separated values, "json" plain basic json (without comma between lines) and "json-asff" is also json with Amazon Security Finding Format that you can ship to Security Hub using `-S`.

or save your report in an S3 bucket (this only works for text or mono. For csv, json or json-asff it has to be copied afterwards):
To save your report in an S3 bucket, use `-B` to define a custom output bucket along with `-M` to define the output format that is going to be uploaded to S3:

```sh
./prowler -M mono | aws s3 cp - s3://bucket-name/prowler-report.txt
./prowler -M csv -B my-bucket/folder/
```
>In case you do not want to use the assumed role credentials but the initial credentials to put the reports into the S3 bucket, use `-D` instead of `-B`.
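
For example, a minimal sketch assuming `-D` takes the same `bucket/folder` argument as `-B` (the bucket name below is illustrative):

```sh
# Upload the CSV report to S3 using the initial credentials instead of the assumed role
./prowler -M csv -D my-bucket/folder/
```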

When generating multiple formats and running using Docker, to retrieve the reports, bind a local directory to the container, e.g.:

@@ -399,7 +400,9 @@ Prowler runs in GovCloud regions as well. To make sure it points to the right AP

### Custom folder for custom checks

Flag `-x /my/own/checks` will include any check in that particular directory. To see how to write checks see [Add Custom Checks](#add-custom-checks) section.
Flag `-x /my/own/checks` will include any check in that particular directory (files must start with `check`). To see how to write checks, see the [Add Custom Checks](#add-custom-checks) section.

S3 URIs are also supported as the custom checks folder, e.g. `s3://bucket/prefix/checks`.
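
For example (bucket and prefix are illustrative, and your credentials must be able to read the objects):

```sh
# Include every check stored under the given S3 prefix
./prowler -x s3://bucket/prefix/checks
```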

### Show or log only FAILs

@@ -488,6 +491,8 @@ Sometimes you may find resources that are intentionally configured in a certain
./prowler -w whitelist_sample.txt
```

S3 URIs are also supported as the allowlist file, e.g. `s3://bucket/prefix/allowlist_sample.txt`.
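
For example (bucket and prefix are illustrative, and your credentials must be able to download the object):

```sh
# Use an allowlist stored in S3 instead of a local file
./prowler -w s3://bucket/prefix/allowlist_sample.txt
```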

The whitelist option works along with other options and adds a `WARNING` instead of `INFO`, `PASS` or `FAIL` to any output format except for `json-asff`.

## How to fix every FAIL
43 changes: 43 additions & 0 deletions include/allowlist
@@ -0,0 +1,43 @@
#!/usr/bin/env bash

# Prowler - the handy cloud security tool (copyright 2018) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.

allowlist(){
  # check if the file is an S3 URI
  if grep -q -E "^s3://([^/]+)/(.*?([^/]+))$" <<< "$ALLOWLIST_FILE"; then
    # download the S3 object
    echo -e "$NOTICE Downloading allowlist from S3 URI $ALLOWLIST_FILE ..."
    if ! $AWSCLI s3 cp "$ALLOWLIST_FILE" allowlist_s3_file.txt $PROFILE_OPT > /dev/null 2>&1; then
      echo "$BAD FAIL! Access Denied trying to download the allowlist from the S3 URI, please make sure it is correct and/or you have permissions to get the S3 object."
      EXITCODE=1
      exit $EXITCODE
    fi
    echo -e "$OK Success! Allowlist was downloaded, starting Prowler..."
    ALLOWLIST_FILE=allowlist_s3_file.txt
    # ignore lines starting with # (comments)
    # ignore inline comments: check1:foo # inline comment
    ALLOWLIST=$(awk '!/^[[:space:]]*#/{print }' <(cat "$ALLOWLIST_FILE") | sed 's/[[:space:]]*#.*$//g')
    # remove temporary file
    rm -f "$ALLOWLIST_FILE"
  else
    # check if the input allowlist file exists
    if [[ -f "$ALLOWLIST_FILE" ]]; then
      # ignore lines starting with # (comments)
      # ignore inline comments: check1:foo # inline comment
      ALLOWLIST=$(awk '!/^[[:space:]]*#/{print }' <(cat "$ALLOWLIST_FILE") | sed 's/[[:space:]]*#.*$//g')
    else
      echo "$BAD FAIL! $ALLOWLIST_FILE does not exist, please input a valid allowlist file."
      EXITCODE=1
      exit $EXITCODE
    fi
  fi
}
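
As a quick sanity check of the comment handling above, here is a minimal sketch that runs the same awk/sed pipeline used by `allowlist()` against a throwaway file (the `/tmp` path is illustrative, not part of the PR):

```sh
# Create a test allowlist with a full-line comment and an inline comment
printf '# full-line comment\ncheck1:foo # inline comment\n' > /tmp/allowlist_test.txt
# Strip full-line comments with awk, then inline comments with sed
awk '!/^[[:space:]]*#/{print }' <(cat /tmp/allowlist_test.txt) | sed 's/[[:space:]]*#.*$//g'
# Expected output: check1:foo
```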
52 changes: 52 additions & 0 deletions include/custom_checks
@@ -0,0 +1,52 @@
#!/usr/bin/env bash

# Prowler - the handy cloud security tool (copyright 2018) by Toni de la Fuente
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy
# of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed
# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, either express or implied. See the License for the
# specific language governing permissions and limitations under the License.

custom_checks(){
  # check if the path is an S3 URI
  if grep -q -E "^s3://([^/]+)/?(.*?([^/]+)/?)?$" <<< "$EXTERNAL_CHECKS_PATH"; then
    if grep -q "check" <<< "$($AWSCLI s3 ls "$EXTERNAL_CHECKS_PATH" $PROFILE_OPT)"; then
      # download the S3 objects
      echo -e "$NOTICE Downloading custom checks from S3 URI $EXTERNAL_CHECKS_PATH..."
      S3_CHECKS_TEMP_FOLDER="$PROWLER_DIR/s3-custom-checks"
      mkdir -p "${S3_CHECKS_TEMP_FOLDER}"
      $AWSCLI s3 sync "$EXTERNAL_CHECKS_PATH" "${S3_CHECKS_TEMP_FOLDER}" $PROFILE_OPT > /dev/null
      # source every downloaded check
      for checks in "${S3_CHECKS_TEMP_FOLDER}"/check*; do
        . "$checks"
        echo -e "$OK Check $(basename "$checks") was included!"
      done
      echo -e "$OK Success! Custom checks were downloaded and included, starting Prowler..."
      # remove temporary dir
      rm -rf "${S3_CHECKS_TEMP_FOLDER}"
    else
      echo "$BAD FAIL! Access Denied trying to download custom checks or $EXTERNAL_CHECKS_PATH does not contain any checks, please make sure it is correct and/or you have permissions to get the S3 objects."
      EXITCODE=1
      # remove temporary dir
      rm -rf "${S3_CHECKS_TEMP_FOLDER}"
      exit $EXITCODE
    fi
  else
    # verify that the input directory exists and contains checks
    if ls "${EXTERNAL_CHECKS_PATH}"/check* > /dev/null 2>&1; then
      for checks in "${EXTERNAL_CHECKS_PATH}"/check*; do
        . "$checks"
        echo -e "$OK Check $(basename "$checks") was included!"
      done
      echo -e "$OK Success! Custom checks were included, starting Prowler..."
    else
      echo "$BAD FAIL! $EXTERNAL_CHECKS_PATH does not exist or does not contain checks, please input a valid custom checks path."
      EXITCODE=1
      exit $EXITCODE
    fi
  fi
}
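
As a usage sketch for the S3 path above (bucket and prefix names are made up, not from the PR), one way to publish a local checks folder so Prowler can sync it down at run time:

```sh
# Upload local custom checks (file names must start with "check") to S3,
# then point Prowler at the S3 URI with -x
aws s3 sync /my/own/checks s3://my-bucket/prowler/checks/
./prowler -x s3://my-bucket/prowler/checks
```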
28 changes: 13 additions & 15 deletions prowler
@@ -43,7 +43,7 @@ FAILED_CHECK_FAILED_SCAN=1
PROWLER_START_TIME=$( date -u +"%Y-%m-%dT%H:%M:%S%z" )
TITLE_ID=""
TITLE_TEXT="CALLER ERROR - UNSET TITLE"
WHITELIST_FILE=""
ALLOWLIST_FILE=""
TOTAL_CHECKS=()

# Ensures command output will always be set to JSON.
@@ -88,8 +88,8 @@ USAGE:
-s Show scoring report (it is included by default in the html report).
-S Send check output to AWS Security Hub. Only valid when the output mode is json-asff
(i.e. "-M json-asff -S").
-x Specify external directory with custom checks
(i.e. /my/own/checks, files must start by "check").
-x Specify external directory with custom checks. S3 URI is supported.
(i.e. /my/own/checks or s3://bucket/prefix/checks, files must start with "check").
-q Get only FAIL findings, will show WARNINGS when a resource is excluded.
-A Account id for the account where to assume a role, requires -R.
(i.e.: 123456789012)
@@ -98,8 +98,8 @@ USAGE:
-T Session duration given to that role credentials in seconds, default 1h (3600) recommended 12h, optional with -R and -A.
(i.e.: 43200)
-I External ID to be used when assuming roles (not mandatory), requires -A and -R.
-w Whitelist file. See whitelist_sample.txt for reference and format.
(i.e.: whitelist_sample.txt)
-w Allowlist file. See allowlist_sample.txt for reference and format. S3 URI is supported.
(i.e.: allowlist_sample.txt or s3://bucket/prefix/allowlist_sample.txt)
-N <shodan_api_key> Shodan API key used by check extra7102.
-o Custom output directory, if not specified will use default prowler/output, requires -M <mode>.
(i.e.: -M csv -o /tmp/reports/)
Expand Down Expand Up @@ -201,7 +201,7 @@ while getopts ":hlLkqp:r:c:C:g:f:m:M:E:x:enbVsSI:A:R:T:w:N:o:B:D:F:zZ:O:" OPTION
SESSION_DURATION_TO_ASSUME=$OPTARG
;;
w )
WHITELIST_FILE=$OPTARG
ALLOWLIST_FILE=$OPTARG
;;
N )
SHODAN_API_KEY=$OPTARG
@@ -294,6 +294,8 @@ unset AWS_DEFAULT_OUTPUT
. $PROWLER_DIR/include/securityhub_integration
. $PROWLER_DIR/include/junit_integration
. $PROWLER_DIR/include/organizations_metadata
. $PROWLER_DIR/include/custom_checks
. $PROWLER_DIR/include/allowlist

# Parses the check file into CHECK_ID's.
if [[ -n "$CHECK_FILE" ]]; then
@@ -308,11 +310,9 @@ if [[ -n "$CHECK_FILE" ]]; then
fi
fi

# Pre-process whitelist file if supplied
if [[ -n "$WHITELIST_FILE" ]]; then
# ignore lines starting with # (comments)
# ignore inline comments: check1:foo # inline comment
WHITELIST="$(awk '!/^[[:space:]]*#/{print }' <(cat "$WHITELIST_FILE") | sed 's/[[:space:]]*#.*$//g')"
# Pre-process allowlist file if supplied
if [[ -n "$ALLOWLIST_FILE" ]]; then
allowlist
fi

# Load all of the groups of checks inside groups folder named as "groupNumber*"
@@ -328,9 +328,7 @@ done

# include checks if external folder is specified
if [[ $EXTERNAL_CHECKS_PATH ]]; then
for checks in $(ls $EXTERNAL_CHECKS_PATH/check*); do
. "$checks"
done
custom_checks
fi

# Get a list of total checks available by ID
@@ -462,7 +460,7 @@ execute_check() {
# Generate the credential report, only if it is group1 related which checks we
# run so that the checks can safely assume it's available
# set the custom ignores list for this check
ignores="$(awk "/${1}/{print}" <(echo "${WHITELIST}"))"
ignores="$(awk "/${1}/{print}" <(echo "${ALLOWLIST}"))"

if [ ${alternate_name} ];then
if [[ ${alternate_name} == check1* || ${alternate_name} == extra71 || ${alternate_name} == extra774 || ${alternate_name} == extra7123 ]];then