Data Protection Metric Dashboard

ALTR’s data protection metrics dashboard allows you to identify and quantify risks in your data security posture. It highlights data assets that have not been properly categorized as sensitive, as well as sensitive data that lacks adequate monitoring or security controls, making the dashboard a critical tool for understanding gaps in data security and enforcing compliance with security protocols.

This guide outlines the steps to set up and use the developed resources in Snowflake, including user-defined functions (UDFs), stored procedures, and tables, and explains how to integrate them with Power BI for visualization.

You will be able to:

  • Create and manage tables that store the metadata from ALTR and Snowflake’s tag references table.

  • Develop UDFs and stored procedures to automate the identification of unmonitored data.

  • Establish connections to Power BI to visualize the data security risks and monitor security compliance.

Prerequisites

Before getting started, ensure you have:

  • A Snowflake account with appropriate privileges (ACCOUNTADMIN access).

  • Power BI Desktop or web access.

  • A basic understanding of SQL, Snowflake, and Power BI.

Login and Navigation

To get started, log in to your Snowflake account using your credentials. Once logged in, you'll be directed to the Snowflake interface.

Create and Use Database and Schema

Before running any queries, run the following SQL to set up the database and schema that will store the UDFs, stored procedures, and relevant tables.

CREATE OR REPLACE DATABASE ALTR_ANALYTICS_DB;
CREATE OR REPLACE SCHEMA PROTECTION_METRICS;
USE DATABASE ALTR_ANALYTICS_DB;
USE SCHEMA PROTECTION_METRICS;
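
As an optional sanity check, you can confirm that the session now points at the new database and schema before creating any objects:

-- Expected: ALTR_ANALYTICS_DB | PROTECTION_METRICS
SELECT CURRENT_DATABASE(), CURRENT_SCHEMA();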

Create External Access Integration

To create external access integration:

  1. Create a secret by running the following query, replacing <SECRET_NAME>, <ALTR_API_KEY> and <ALTR_API_SECRET> with your environment-specific details. For additional information on creating an API key and secret, refer to Manage API Keys.

    -- Create secrets
    CREATE OR REPLACE SECRET <SECRET_NAME>
    TYPE = PASSWORD
    USERNAME = '<ALTR_API_KEY>'
    PASSWORD = '<ALTR_API_SECRET>';
  2. Create a network rule by running the following query:

    Note

    Before running the query, update <NETWORK_RULE_NAME> in line 2 with your environment-specific details.

    -- Create Network rule
    CREATE OR REPLACE NETWORK RULE <NETWORK_RULE_NAME>
        MODE = EGRESS
        TYPE = HOST_PORT
        VALUE_LIST = ('altrnet.live.altr.com','api.live.altr.com');
  3. Create the external access integration by running the following query (an optional verification sketch follows this list):

    Note

    Before running the query, update the following lines with your environment-specific details:

    • Line 2—<EAI_NAME>

    • Line 3—<NETWORK_RULE_NAME>

    • Line 4—<SECRET_NAME>

    -- Create external Access Integration    
    CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION <EAI_NAME>
        ALLOWED_NETWORK_RULES = (<NETWORK_RULE_NAME>)
        ALLOWED_AUTHENTICATION_SECRETS = (<SECRET_NAME>)
        ENABLED = True;
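
Once the secret, network rule, and integration are in place, you can optionally confirm that all three objects exist; a minimal sketch using standard SHOW commands:

-- Confirm the objects created above are visible
SHOW SECRETS;
SHOW NETWORK RULES;
SHOW EXTERNAL ACCESS INTEGRATIONS;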

Create User-Defined Functions to Call API Endpoints

Before creating the UDFs, ensure that you are using the database and schema created in Create and Use Database and Schema.

Create a user-defined function to get the details of the tags to which policies are attached. Use the https://altrnet.live.altr.com/api/locks endpoint to obtain the data from ALTR.

To create the function that obtains tag details, run the following query:

Note

Before running the query, update the following lines with your environment-specific details:

  • Line 6—<SECRET_NAME>

  • Line 7—<EAI_NAME>

CREATE OR REPLACE FUNCTION ALTR_ANALYTICS_DB.PROTECTION_METRICS.ALTR_POLICY_EF()
RETURNS TABLE(COL VARIANT)
LANGUAGE PYTHON
RUNTIME_VERSION = '3.8'
PACKAGES = ('snowflake-snowpark-python', 'requests', 'pandas')
SECRETS = ('mapi_creds' = <SECRET_NAME>) 
EXTERNAL_ACCESS_INTEGRATIONS = (<EAI_NAME>)
HANDLER = 'ALTR_POLICY_EF'
AS
$$
import _snowflake
import requests
from requests.auth import HTTPBasicAuth

class ALTR_POLICY_EF:
    def process(self):
        creds = _snowflake.get_username_password('mapi_creds')
        basic = HTTPBasicAuth(creds.username, creds.password)
        base_url = "https://altrnet.live.altr.com/api"
        url_locks = base_url + "/locks"

        response_lock = requests.get(url_locks, auth=basic)
        if response_lock.status_code != 200:
            raise Exception(f"Failed to retrieve data, status code: {response_lock.status_code}")

        data_locks = response_lock.json()

        # Collect the detail response for each lock in a single dictionary
        combined_responses = {}
        for lock in data_locks["data"]["locks"]:
            lock_id = lock["lockId"]
            if isinstance(lock_id, list):
                for single_lock_id in lock_id:
                    response_lock_id = requests.get(f"{url_locks}/{single_lock_id}", auth=basic)
                    combined_responses[single_lock_id] = response_lock_id.json()
            else:
                response_lock_id = requests.get(f"{url_locks}/{lock_id}", auth=basic)
                combined_responses[lock_id] = response_lock_id.json()

        # Return the combined responses as a single-row VARIANT result
        return [(combined_responses,)]
$$;
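
Because the function returns a table, it is invoked with SELECT ... FROM TABLE(...) rather than CALL. As an optional smoke test (the same pattern verifies the other two functions created below):

-- Returns one row containing the combined ALTR lock details as a VARIANT
SELECT * FROM TABLE(ALTR_ANALYTICS_DB.PROTECTION_METRICS.ALTR_POLICY_EF());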

Create a user-defined function to get the details of the alerts/thresholds applied to the policies. Use the https://altrnet.live.altr.com/api/thresholds endpoint to obtain the data from ALTR.

To create the function that obtains alert/threshold details, run the following query:

Note

Before running the query, update the following lines with your environment-specific details:

  • Line 6—<SECRET_NAME>

  • Line 7—<EAI_NAME>

CREATE OR REPLACE FUNCTION ALTR_ANALYTICS_DB.PROTECTION_METRICS.ALTR_MONITOR_EF()
RETURNS TABLE(COL VARIANT)
LANGUAGE PYTHON
RUNTIME_VERSION = '3.8'
PACKAGES = ('snowflake-snowpark-python', 'requests', 'pandas')
SECRETS = ('mapi_creds' = <SECRET_NAME>)
EXTERNAL_ACCESS_INTEGRATIONS = (<EAI_NAME>)
HANDLER = 'ALTR_MONITOR_EF'
AS
$$
import _snowflake
import requests
from requests.auth import HTTPBasicAuth

class ALTR_MONITOR_EF:
    def process(self):
        creds = _snowflake.get_username_password('mapi_creds')
        basic_auth = HTTPBasicAuth(creds.username, creds.password)
        base_url = "https://altrnet.live.altr.com/api"
        url_thresholds = base_url + "/thresholds"

        # First request with auth
        response_thresholds = requests.get(url_thresholds, auth=basic_auth)
        if response_thresholds.status_code != 200:
            raise Exception(f"Failed to retrieve data, status code: {response_thresholds.status_code}")

        data_thresholds = response_thresholds.json()

        # Check that "data" is in the response
        if "data" not in data_thresholds:
            raise Exception("Response does not contain expected 'data' field.")

        combined_responses = {}

        # Loop through each threshold to get detailed information
        for threshold in data_thresholds["data"]:
            threshold_id = threshold.get("id")
            if threshold_id is not None:
                url_with_id = f"{url_thresholds}/{threshold_id}"

                # Use basic_auth for the detail request
                response_detail = requests.get(url_with_id, auth=basic_auth)
                if response_detail.status_code == 200:
                    combined_responses[threshold_id] = response_detail.json()
                else:
                    combined_responses[threshold_id] = {
                        "error": f"Failed to retrieve details for ID {threshold_id}, status code: {response_detail.status_code}"
                    }

        # Return the combined responses as a single-row VARIANT result
        return [(combined_responses,)]
$$;

Create a user-defined function to get the details of the columns that are marked as tokenized in ALTR. Use the https://altrnet.live.altr.com/api/data endpoint to obtain the data from ALTR.

To create the function that obtains tokenization details, run the following query:

Note

Before running the query, update the following lines with your environment-specific details:

  • Line 6—<SECRET_NAME>

  • Line 7—<EAI_NAME>

CREATE OR REPLACE FUNCTION ALTR_ANALYTICS_DB.PROTECTION_METRICS.ALTR_TOKENIZE_EF()
RETURNS TABLE(COL VARIANT)
LANGUAGE PYTHON
RUNTIME_VERSION = '3.8'
PACKAGES = ('snowflake-snowpark-python', 'requests', 'pandas')
SECRETS = ('mapi_creds' = <SECRET_NAME>)
EXTERNAL_ACCESS_INTEGRATIONS = (<EAI_NAME>)
HANDLER = 'ALTR_TOKENIZE_EF'
AS
$$
import _snowflake
import requests
from requests.auth import HTTPBasicAuth

class ALTR_TOKENIZE_EF:
    def process(self):
        creds = _snowflake.get_username_password('mapi_creds')
        basic_auth = HTTPBasicAuth(creds.username, creds.password)
        base_url = "https://altrnet.live.altr.com/api"
        url_data = base_url + "/data"

        # Request the tokenization data with auth
        response_data = requests.get(url_data, auth=basic_auth)
        if response_data.status_code != 200:
            raise Exception(f"Failed to retrieve data, status code: {response_data.status_code}")

        # Return the full response as a single-row VARIANT result
        return [(response_data.json(),)]
$$;

Create Stored Procedures to Create Tables

Before creating the stored procedures, make sure you are using the database and schema created in Create and Use Database and Schema.

Note

Run the stored procedures in the specified sequence. These stored procedures create interlinked tables, and following the correct order is essential for maintaining data integrity.

Create a stored procedure that calls the user-defined function created in Alerts/Threshold Details and runs the statements to create the required tables. Use the database and schema created in Create and Use Database and Schema.

To create the stored procedure that calls the user-defined function for alerts/thresholds, run the following query:

CREATE OR REPLACE PROCEDURE ALTR_MONITORING_SP()
RETURNS VARCHAR
LANGUAGE SQL
EXECUTE AS OWNER
AS
DECLARE
    CREATE_STATEMENT VARCHAR;
BEGIN
    --Create JSON_MONITOR_RECORD table
    CREATE_STATEMENT := 'CREATE OR REPLACE TABLE JSON_MONITOR_RECORD AS SELECT COL FROM TABLE (ALTR_ANALYTICS_DB.PROTECTION_METRICS.ALTR_MONITOR_EF())';
    EXECUTE IMMEDIATE :CREATE_STATEMENT;

    --Create RAW_LOCK_MONITOR table
    CREATE_STATEMENT := 'CREATE OR REPLACE TABLE RAW_LOCK_MONITOR AS 
                            SELECT f.value:data AS col 
                            FROM (
                               SELECT col AS json_data
                               FROM JSON_MONITOR_RECORD
                            ), LATERAL FLATTEN(input => PARSE_JSON(json_data)) AS f';
    EXECUTE IMMEDIATE :CREATE_STATEMENT;

    --Create LOCK_MONITOR_INFO table
    CREATE_STATEMENT := 'CREATE OR REPLACE TABLE LOCK_MONITOR_INFO (
                            ALTR_LOCK_ID INT PRIMARY KEY,
                            ALTR_LOCK_NAME STRING
                        ) AS 
                        SELECT DISTINCT
                            lock.value:lockId::int AS ALTR_LOCK_ID,
                            lock.value:lockName::string AS ALTR_LOCK_NAME
                        FROM 
                            RAW_LOCK_MONITOR,
                            LATERAL FLATTEN(input => col:locks) AS lock';
    EXECUTE IMMEDIATE :CREATE_STATEMENT;

    RETURN 'Tables created successfully.';
END;

To verify the successful execution of the stored procedure, call it. This step ensures that all components are functioning as needed.

CALL ALTR_MONITORING_SP();
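
If the call succeeds, the procedure's output tables exist in the current schema; an optional spot check:

-- Inspect the lock/alert table built by ALTR_MONITORING_SP
SELECT * FROM LOCK_MONITOR_INFO LIMIT 10;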

Create a stored procedure that calls the user-defined function created in Tag Details and runs the statements to create the required tables. Use the database and schema created in Create and Use Database and Schema.

To create the stored procedure that calls the user-defined function for tags, run the following query:

CREATE OR REPLACE PROCEDURE ALTR_TAG_INFO_SP()
RETURNS VARCHAR
LANGUAGE SQL
EXECUTE AS OWNER
AS
DECLARE
    CREATE_STATEMENT VARCHAR;
BEGIN 
    --Create ALTR_RAW_TAG_JSON_INFO table
    CREATE_STATEMENT := 'CREATE OR REPLACE TABLE ALTR_RAW_TAG_JSON_INFO AS SELECT COL FROM TABLE (ALTR_ANALYTICS_DB.PROTECTION_METRICS.ALTR_POLICY_EF())';
    EXECUTE IMMEDIATE :CREATE_STATEMENT;

    --Create ALTR_TAG_JSON_INFO table
    CREATE_STATEMENT := 'CREATE OR REPLACE TABLE ALTR_TAG_JSON_INFO AS 
                   SELECT f.value:data AS col 
                   FROM (
                       SELECT col AS json_data
                       FROM ALTR_RAW_TAG_JSON_INFO
                   ), LATERAL FLATTEN(input => PARSE_JSON(json_data)) AS f';
    EXECUTE IMMEDIATE :CREATE_STATEMENT;

    --Create ALTR_TAG_INFO table
    CREATE_STATEMENT := 'CREATE OR REPLACE TABLE ALTR_TAG_INFO (
                            ALTR_TAG_NAME STRING,
                            ALTR_TAG_VALUE STRING,
                            ALTR_TAG_NAME_VAL STRING, 
                            PRIMARY KEY (ALTR_TAG_NAME_VAL)
                        ) AS  
                        SELECT DISTINCT
                            tag.value:tagName::string AS ALTR_TAG_NAME,
                            tag.value:tagValue::string AS ALTR_TAG_VALUE,
                            CASE 
                                WHEN tag.value:tagName::string IS NOT NULL AND tag.value:tagValue::string IS NOT NULL THEN 
                                    tag.value:tagName::string || ''.'' || tag.value:tagValue::string
                                WHEN tag.value:tagName::string IS NOT NULL THEN 
                                    tag.value:tagName::string
                                WHEN tag.value:tagValue::string IS NOT NULL THEN 
                                    tag.value:tagValue::string
                                ELSE 
                                    ''''
                            END AS ALTR_TAG_NAME_VAL
                        FROM 
                            ALTR_TAG_JSON_INFO AS col,
                            LATERAL FLATTEN(input => col:tags) AS tag';
    EXECUTE IMMEDIATE :CREATE_STATEMENT;

    --Create ALTR_LOCK_INFO table
    CREATE_STATEMENT := 'CREATE OR REPLACE TABLE ALTR_LOCK_INFO ( 
                            ALTR_LOCK_ID INTEGER,
                            ALTR_LOCK_NAME STRING,
                            PRIMARY KEY (ALTR_LOCK_ID)
                            ) AS  
                        SELECT DISTINCT
                            col:lockId::integer AS ALTR_LOCK_ID,
                            col:lockName::string AS ALTR_LOCK_NAME
                        FROM 
                            ALTR_TAG_JSON_INFO AS col';
    EXECUTE IMMEDIATE :CREATE_STATEMENT;    

    --Create ALTR_POLICY_TAG_DETAILS table
    CREATE_STATEMENT := 'CREATE OR REPLACE TABLE ALTR_POLICY_TAG_DETAILS (
                            ALTR_LOCK_ID INTEGER,
                            ALTR_TAG_NAME_VAL STRING,
                            FOREIGN KEY (ALTR_LOCK_ID) REFERENCES ALTR_LOCK_INFO (ALTR_LOCK_ID),
                            FOREIGN KEY (ALTR_TAG_NAME_VAL) REFERENCES ALTR_TAG_INFO (ALTR_TAG_NAME_VAL)
                            ) AS  
                        SELECT
                            col:lockId::integer AS ALTR_LOCK_ID,
                            CASE 
                                WHEN tag.value:tagName::string IS NOT NULL AND tag.value:tagValue::string IS NOT NULL THEN 
                                    tag.value:tagName::string || ''.'' || tag.value:tagValue::string
                                WHEN tag.value:tagName::string IS NOT NULL THEN 
                                    tag.value:tagName::string
                                WHEN tag.value:tagValue::string IS NOT NULL THEN 
                                    tag.value:tagValue::string
                                ELSE 
                                    ''''
                            END AS ALTR_TAG_NAME_VAL 
                        FROM 
                            ALTR_TAG_JSON_INFO AS col,
                            LATERAL FLATTEN(input => col:tags) AS tag';
    EXECUTE IMMEDIATE :CREATE_STATEMENT;    
                   

    --Create ALTR_POLICY_TAG_INFO table
    CREATE_STATEMENT := 'CREATE OR REPLACE TABLE ALTR_POLICY_TAG_INFO(
                            ALTR_LOCK_ID INTEGER,
                            ALTR_TAG_NAME_VAL STRING,
                            ALERT_ID INTEGER,
                            FOREIGN KEY (ALTR_LOCK_ID) REFERENCES ALTR_LOCK_INFO(ALTR_LOCK_ID),
                            FOREIGN KEY (ALTR_TAG_NAME_VAL) REFERENCES ALTR_TAG_INFO(ALTR_TAG_NAME_VAL),
                            FOREIGN KEY (ALERT_ID) REFERENCES LOCK_MONITOR_INFO(ALTR_LOCK_ID)
                            )
                            AS
                            SELECT APT.ALTR_LOCK_ID, APT.ALTR_TAG_NAME_VAL, LM.ALTR_LOCK_ID AS ALERT_ID FROM ALTR_POLICY_TAG_DETAILS APT
                            LEFT JOIN LOCK_MONITOR_INFO LM
                            ON LM.ALTR_LOCK_ID = APT.ALTR_LOCK_ID';
    EXECUTE IMMEDIATE :CREATE_STATEMENT;

    RETURN 'Tables created successfully.';
END;

To verify the successful execution of the stored procedure, call it. This step ensures that all components are functioning as needed.

CALL ALTR_TAG_INFO_SP();
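
As before, you can optionally spot-check the tables this procedure builds, for example the lock-to-tag mapping:

-- Inspect the policy-to-tag mapping built by ALTR_TAG_INFO_SP
SELECT * FROM ALTR_POLICY_TAG_INFO LIMIT 10;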

Create a stored procedure that calls the user-defined function created in Tokenize Details and runs the statements to create the required tables. Use the database and schema created in Create and Use Database and Schema.

To create the stored procedure that calls the user-defined function for tokenized columns, run the following query:

CREATE OR REPLACE PROCEDURE ALTR_TOKENIZE_SP()
RETURNS VARCHAR
LANGUAGE SQL
EXECUTE AS OWNER
AS
DECLARE
    CREATE_STATEMENT VARCHAR;
BEGIN 
    --Create TOKENIZE table
    CREATE_STATEMENT := 'CREATE OR REPLACE TABLE TOKENIZE AS SELECT * FROM TABLE (ALTR_ANALYTICS_DB.PROTECTION_METRICS.ALTR_TOKENIZE_EF())';
    EXECUTE IMMEDIATE :CREATE_STATEMENT;

    --Create ALTR_TOKENIZE_INFO table
    CREATE_STATEMENT := 'CREATE OR REPLACE TABLE ALTR_TOKENIZE_INFO
                        (
                            DATABASE_ID INTEGER,
                            DB_ID VARCHAR,       
                            DATABASE_NAME VARCHAR, 
                            SCHEMA_NAME VARCHAR,  
                            TABLE_NAME VARCHAR,   
                            COLUMN_NAME VARCHAR,   
                            TOKENIZE_STATUS INTEGER,
                            PRIMARY KEY (DB_ID)
                        ) AS
                        SELECT 
                            field.value:databaseId::INTEGER AS DATABASE_ID,
                            COALESCE(field.value:friendlyDatabaseName::STRING, ''.'') || ''.'' || 
                            COALESCE(SPLIT_PART(field.value:tableName::STRING, ''.'', 1), ''.'') || ''.'' ||
                            COALESCE(SPLIT_PART(field.value:tableName::STRING, ''.'', 2), ''.'') || ''.'' || 
                            COALESCE(field.value:columnName::STRING, ''.'') AS DB_ID,
                            field.value:friendlyDatabaseName::STRING AS DATABASE_NAME,
                            SPLIT_PART(field.value:tableName::STRING, ''.'', 1) AS SCHEMA_NAME,
                            SPLIT_PART(field.value:tableName::STRING, ''.'', 2) AS TABLE_NAME,
                            field.value:columnName::STRING AS COLUMN_NAME,
                            CASE 
                                WHEN field.value:scatterStatus::STRING = ''unscattered'' THEN 0
                                WHEN field.value:scatterStatus::STRING = ''tokenize'' THEN 1
                                ELSE NULL  -- You can adjust this based on your needs
                            END AS TOKENIZE_STATUS
                        FROM 
                            TOKENIZE,
                            LATERAL FLATTEN(input => COL:data:fields) AS field';
    EXECUTE IMMEDIATE :CREATE_STATEMENT;

    RETURN 'Tables created successfully.';
END;

To verify the successful execution of the stored procedure, call it. This step ensures that all components are functioning as needed.

CALL ALTR_TOKENIZE_SP();
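
An optional look at the resulting table shows which columns ALTR reports as tokenized:

-- TOKENIZE_STATUS: 1 = tokenized, 0 = unscattered
SELECT DATABASE_NAME, SCHEMA_NAME, TABLE_NAME, COLUMN_NAME, TOKENIZE_STATUS
FROM ALTR_TOKENIZE_INFO
LIMIT 10;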

Create a stored procedure that executes SQL statements to create the tables associated with tags in Snowflake. Use the database and schema created in Create and Use Database and Schema.

To create the stored procedure that builds the tables from Snowflake tag metadata, run the following query:

USE DATABASE ALTR_ANALYTICS_DB;
USE SCHEMA PROTECTION_METRICS;

-- Create a stored procedure to create the required tables.
CREATE OR REPLACE PROCEDURE ALTR_ANALYTICS_DB.PROTECTION_METRICS.ALTR_SF_METADATA_SP(USE_TAG_VALUE BOOLEAN, DB_NAME VARCHAR)
RETURNS VARCHAR
LANGUAGE SQL
EXECUTE AS OWNER
AS
DECLARE
    CREATE_STATEMENT VARCHAR;
BEGIN
    IF (USE_TAG_VALUE=FALSE) THEN
        CREATE_STATEMENT := 'CREATE OR REPLACE TABLE SNOW_TAG_INFO 
                                (SNOW_TAG_NM_VAL STRING,
                                ALTR_TAG_NAME_VAL STRING,
                                TAG_NAME STRING,
                                TAG_VALUE STRING,
                                PRIMARY KEY (SNOW_TAG_NM_VAL),
                                FOREIGN KEY (ALTR_TAG_NAME_VAL) REFERENCES ALTR_TAG_INFO(ALTR_TAG_NAME_VAL))
                                AS
                                SELECT DISTINCT 
                                CASE 
                                WHEN ST.TAG_VALUE = AT.ALTR_TAG_VALUE THEN 
                                COALESCE(ST.TAG_NAME, '''') || ''.'' || COALESCE(ST.TAG_VALUE, '''')
                                ELSE 
                                ST.TAG_NAME
                                END AS SNOW_TAG_NM_VAL,
                                AT.ALTR_TAG_NAME_VAL, 
                                ST.TAG_NAME, 
                                CASE
                                WHEN ST.TAG_VALUE = AT.ALTR_TAG_VALUE THEN ST.TAG_VALUE
                                WHEN ST.TAG_VALUE IS NOT NULL AND AT.ALTR_TAG_VALUE IS NULL THEN NULL
                                ELSE NULL
                                END AS TAG_VALUE
                                FROM 
                                SNOWFLAKE.ACCOUNT_USAGE.TAG_REFERENCES ST
                                FULL JOIN 
                                ALTR_TAG_INFO AT
                                ON 
                                AT.ALTR_TAG_NAME_VAL = SNOW_TAG_NM_VAL
                                WHERE 
                                ST.TAG_DATABASE = '''|| DB_NAME ||'''';
                                
        EXECUTE IMMEDIATE :CREATE_STATEMENT;
    
        CREATE_STATEMENT := 'CREATE OR REPLACE TABLE SNOW_DB_INFO(
                                DATABASE_NAME STRING,
                                SCHEMA_NAME STRING,
                                TABLE_NAME STRING,
                                COLUMN_NAME STRING,
                                DB_ID STRING,
                                SNOW_TAG_NM_VAL STRING,
                                PRIMARY KEY (DB_ID),
                                FOREIGN KEY (SNOW_TAG_NM_VAL) REFERENCES SNOW_TAG_INFO(SNOW_TAG_NM_VAL),
                                FOREIGN KEY (DB_ID) REFERENCES ALTR_TOKENIZE_INFO(DB_ID)
                                )
                                AS
                                SELECT DISTINCT ST.OBJECT_DATABASE AS DATABASE_NAME, ST.OBJECT_SCHEMA AS SCHEMA_NAME, ST.OBJECT_NAME AS TABLE_NAME, ST.COLUMN_NAME,
                                COALESCE(ST.OBJECT_DATABASE,'''') || ''.'' || COALESCE(ST.OBJECT_SCHEMA,'''') || ''.'' || COALESCE(ST.OBJECT_NAME,'''') || ''.'' || COALESCE(COLUMN_NAME,'''') 
                                AS DB_ID,
                                STD.SNOW_TAG_NM_VAL
                                FROM SNOWFLAKE.ACCOUNT_USAGE.TAG_REFERENCES ST
                                FULL JOIN SNOW_TAG_INFO STD
                                ON STD.SNOW_TAG_NM_VAL = ST.TAG_NAME
                                WHERE TAG_DATABASE = '''|| DB_NAME ||'''';
        EXECUTE IMMEDIATE :CREATE_STATEMENT;
    ELSE
            CREATE_STATEMENT := 'CREATE OR REPLACE TABLE SNOW_TAG_INFO 
                            (SNOW_TAG_NM_VAL STRING,
                            ALTR_TAG_NAME_VAL STRING,
                            TAG_NAME STRING,
                            TAG_VALUE STRING,
                            PRIMARY KEY (SNOW_TAG_NM_VAL),
                            FOREIGN KEY (ALTR_TAG_NAME_VAL) REFERENCES ALTR_TAG_INFO(ALTR_TAG_NAME_VAL))
                            AS
                            SELECT DISTINCT COALESCE(ST.TAG_NAME,'''') || ''.'' || COALESCE(ST.TAG_VALUE,'''') AS SNOW_TAG_NM_VAL,AT.ALTR_TAG_NAME_VAL, TAG_NAME, TAG_VALUE
                            FROM SNOWFLAKE.ACCOUNT_USAGE.TAG_REFERENCES ST
                            FULL JOIN ALTR_TAG_INFO AT
                            ON AT.ALTR_TAG_NAME_VAL = SNOW_TAG_NM_VAL
                            WHERE TAG_DATABASE = '''|| DB_NAME ||'''';
                            
        EXECUTE IMMEDIATE :CREATE_STATEMENT;

    
        CREATE_STATEMENT := 'CREATE OR REPLACE TABLE SNOW_DB_INFO(
                            DATABASE_NAME STRING,
                            SCHEMA_NAME STRING,
                            TABLE_NAME STRING,
                            COLUMN_NAME STRING,
                            DB_ID STRING,
                            SNOW_TAG_NM_VAL STRING,
                            PRIMARY KEY (DB_ID),
                            FOREIGN KEY (SNOW_TAG_NM_VAL) REFERENCES SNOW_TAG_INFO(SNOW_TAG_NM_VAL),
                            FOREIGN KEY (DB_ID) REFERENCES ALTR_TOKENIZE_INFO(DB_ID)
                            )
                            AS
                            SELECT DISTINCT ST.OBJECT_DATABASE AS DATABASE_NAME, ST.OBJECT_SCHEMA AS SCHEMA_NAME, ST.OBJECT_NAME AS TABLE_NAME, ST.COLUMN_NAME,
                            COALESCE(ST.OBJECT_DATABASE,'''') || ''.'' || COALESCE(ST.OBJECT_SCHEMA,'''') || ''.'' || COALESCE(ST.OBJECT_NAME,'''') || ''.'' || COALESCE(COLUMN_NAME,'''') 
                            AS DB_ID,
                            STD.SNOW_TAG_NM_VAL
                            FROM SNOWFLAKE.ACCOUNT_USAGE.TAG_REFERENCES ST
                            FULL JOIN SNOW_TAG_INFO STD
                            ON STD.SNOW_TAG_NM_VAL = COALESCE(ST.TAG_NAME,''.'') || ''.'' || COALESCE (ST.TAG_VALUE,''.'')
                            WHERE TAG_DATABASE = '''|| DB_NAME ||'''';
        EXECUTE IMMEDIATE :CREATE_STATEMENT;
 
    END IF;
    RETURN 'Tables created successfully.';
END;

To verify the successful execution of the stored procedure, call it with the required parameters. This step ensures that all components are functioning as needed.

Make sure to specify the parameters:

  • Condition (USE_TAG_VALUE):

    • TRUE—if using tag names and tag values to mask the data in ALTR

    • FALSE—if using only tag names to mask the data in ALTR

  • DB_NAME: The database name where the tags associated with ALTR are stored.

CALL ALTR_SF_METADATA_SP(<condition>,'<DB_NAME>');
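
For example, a hypothetical invocation for an account that masks by both tag name and tag value, with the ALTR-associated tags stored in a database named SALES_DB (substitute your own values):

-- Hypothetical example; replace SALES_DB with your database name
CALL ALTR_SF_METADATA_SP(TRUE, 'SALES_DB');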

Create Tasks to Automate All the Steps

Use the following SQL to create a single task that runs the stored procedures established in the Snowflake environment; the procedures in turn invoke the UDFs, so automating their execution ensures timely creation of the required tables. This task is scheduled to run daily at 5:00 AM UTC (midnight CDT), but it can be scheduled to run at a different time by updating the CRON expression on line 3. Use the database and schema created in Create and Use Database and Schema.

To create tasks to automate all the steps, run the following query:

Note

Before running the query, update the following lines with your environment-specific details:

  • Line 1—<TASK_NAME>

  • Line 2—<WAREHOUSE_NAME>

  • Line 10—Params: <condition>—TRUE or FALSE; <DB_NAME>—database where the tags associated with ALTR are stored

CREATE OR REPLACE TASK <TASK_NAME>
  WAREHOUSE = '<WAREHOUSE_NAME>'
  SCHEDULE = 'USING CRON 0 5 * * * UTC'  -- Runs daily at 5 AM UTC (midnight CDT)
AS
BEGIN
    -- The stored procedures invoke the UDFs internally, so only the procedures are called here
    CALL ALTR_MONITORING_SP();
    CALL ALTR_TAG_INFO_SP();
    CALL ALTR_TOKENIZE_SP();
    CALL ALTR_SF_METADATA_SP(<condition>,'<DB_NAME>');
END;

To initiate the task, run the following query:

ALTER TASK <TASK_NAME> RESUME;

To suspend the task, run the following query:

ALTER TASK <TASK_NAME> SUSPEND;
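
To confirm the task's state and review recent runs, a minimal sketch using Snowflake's task metadata:

-- Check the task's state and schedule
SHOW TASKS LIKE '<TASK_NAME>';

-- Review recent executions of the task
SELECT NAME, STATE, SCHEDULED_TIME, COMPLETED_TIME
FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(TASK_NAME => '<TASK_NAME>'))
ORDER BY SCHEDULED_TIME DESC
LIMIT 10;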

Connect Power BI to Snowflake

There are two ways to set up a Power BI connection to Snowflake:

  • Use our template for an automated setup. This is the preferred method.

  • Manually. The template assumes the specific table and database names used in setting up this dashboard. If you have changed the table and/or database names, manually connect Power BI to Snowflake.

To set up Power BI Desktop:

  1. Download and install Power BI Desktop.

  2. Launch Power BI Desktop.

We've provided a template file to help you get started quickly.

To use the template to connect Power BI to Snowflake:

  1. Download the template.

  2. Enter your Snowflake server name (e.g. yourorganization.snowflakecomputing.com) and warehouse details.

  3. Enter your Snowflake username and password to connect your data and to set up the security dashboard.

    Note

    Ensure that the database name, schema name, and table names remain unchanged; the template file is designed specifically around the naming conventions outlined in the previous steps.

Connect Power BI to Snowflake by using the Snowflake connector in Power BI Desktop.

To import data from relevant Snowflake tables:

  1. Select tables for the Data Protection Metrics Dashboard.

    1. Select the relevant tables from your Snowflake schema.

    2. Select Import, or select DirectQuery instead to fetch data dynamically.

    3. Click Load to import the tables into Power BI.

  2. Verify that your data model looks like the following in Power BI:

    DPMD_VerifyDataModel.png
  3. Set up relationships. Navigate to the Model view in Power BI (found in the left-hand menu). Power BI automatically detects relationships between tables if they share common fields (such as a primary key or foreign key).

  4. Create new relationships (if needed). If Power BI didn't automatically detect relationships between your tables, manually create them by dragging a column from one table to its corresponding column in another table to define the relationship; a key-check sketch follows this list.
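
These relationships hinge on the key columns created earlier, such as DB_ID (shared by SNOW_DB_INFO and ALTR_TOKENIZE_INFO) and SNOW_TAG_NM_VAL (shared by SNOW_DB_INFO and SNOW_TAG_INFO). If detection misbehaves, an optional check back in Snowflake can reveal unmatched keys; a minimal sketch:

-- Count tagged Snowflake columns with no matching row in the ALTR tokenization table
SELECT COUNT(*) AS unmatched_columns
FROM SNOW_DB_INFO S
LEFT JOIN ALTR_TOKENIZE_INFO T
    ON S.DB_ID = T.DB_ID
WHERE T.DB_ID IS NULL;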

Create Measures

Create measures to enhance the functionality of the dashboard, enabling the development of relevant metrics for effective data protection analysis.

Count of Blanks

To create a count of blanks metric, create the following DAX measure:

Count of Blanks = 
VAR SelectedTag = SELECTEDVALUE('ALTR_TAG_INFO'[ALTR_TAG_NAME])
VAR CountResult = 
    CALCULATE(
        COUNTA('SNOW_DB_INFO'[COLUMN_NAME]),
        ISBLANK('ALTR_TAG_INFO'[ALTR_TAG_NAME]),
        'ALTR_TAG_INFO'[ALTR_TAG_NAME] = SelectedTag
    )
RETURN
    IF(ISBLANK(CountResult), 0, CountResult)
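
For a rough sanity check outside Power BI, a SQL approximation of the same idea (assuming the tables created above, and that a missing ALTR tag marks an unprotected column) can be run in Snowflake:

-- Approximate "Count of Blanks": columns tagged in Snowflake with no matching ALTR tag
SELECT COUNT(D.COLUMN_NAME) AS count_of_blanks
FROM SNOW_DB_INFO D
LEFT JOIN SNOW_TAG_INFO T
    ON D.SNOW_TAG_NM_VAL = T.SNOW_TAG_NM_VAL
WHERE T.ALTR_TAG_NAME_VAL IS NULL;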

Count of Alerts

To create a count of alerts metric, create the following DAX measure:

Count of Alerts = 
VAR CountResult = CALCULATE(
    COUNTA('SNOW_DB_INFO'[COLUMN_NAME]),
    FILTER(
        ALL('ALTR_POLICY_TAG_INFO'), 
        NOT(ISBLANK('ALTR_POLICY_TAG_INFO'[ALERT_ID]))
    )
)
RETURN
IF(ISBLANK(CountResult),0,CountResult)

Count of Protected Columns

To create a count of protected columns metric, create the following DAX measure:

Count of Protected columns = 
VAR CountResult = CALCULATE(
    COUNTA('SNOW_DB_INFO'[COLUMN_NAME]),
    RELATEDTABLE('ALTR_POLICY_TAG_INFO')
)
RETURN
IF(ISBLANK(CountResult),0,CountResult)

Create Charts and Visualizations

If using the template to connect Power BI to Snowflake, these charts and visualizations are created automatically.

If manually connecting Power BI to Snowflake, create the following charts and visualizations using the corresponding tables listed. This step requires working knowledge of Power BI.

Description: This bar chart visualizes the number of unprotected data assets (columns) categorized by tag name. It allows users to quickly identify which tag names correspond to the highest volumes of unprotected data, highlighting areas that require immediate attention. The immediate measure is to protect those unprotected columns in ALTR, ensuring that sensitive information is adequately secured and compliant with data protection protocols.

Tables: The chart is created using tables SNOW_DB_INFO and SNOW_TAG_INFO.

DPMD_BarChart1.png

Description: This bar chart displays the number of unprotected data assets (columns) categorized by tag values within a specific tag name. It enables users to identify which tag values are unprotected, highlighting areas that necessitate immediate action. The immediate step is to protect these unprotected columns in ALTR, ensuring that sensitive information is adequately secured and compliant with data protection protocols.

Tables: The chart is created using tables SNOW_DB_INFO and SNOW_TAG_INFO.

DPMD_BarChart2.png

Description: This gauge visualizes the number of unprotected data assets (columns) categorized by both tag name and tag values. It allows users to easily identify the number of columns that require protection in ALTR. The gauge includes alert features that notify users when the threshold reaches zero, indicating that a column tagged in Snowflake remains unprotected in ALTR. This prompt enables users to take immediate action. The primary objective is to secure these unprotected columns in ALTR, ensuring that sensitive information is adequately protected and compliant with data protection protocols.

Tables: The chart is created using measure Count of Blanks and table SNOW_DB_INFO.

DPMD_Gauge1.png

Here you can see that out of 149 columns there are 51 columns tagged in Snowflake that remain unprotected in ALTR.

Description: This gauge visualizes three key values: the number of columns being monitored in ALTR, the number of protected columns in ALTR, and the total number of columns tagged in Snowflake that require protection in ALTR. By comparing these values, users can easily identify gaps in monitoring coverage and protection efforts. This insight helps ensure that sensitive data is adequately secured and compliant with data protection protocols, allowing for prompt action to address any deficiencies.

Tables: The chart is created using measures Count of Alerts and Count of Protected Columns and table SNOW_DB_INFO.

DPMD_Gauge2.png

Here you can see there are 10 columns currently being monitored in ALTR and 98 columns that are protected. This reveals a gap of 88 columns where thresholds or alerts can still be assigned. Additionally, there are a total of 149 columns tagged in Snowflake on which policies can be applied.

Description: This bar chart illustrates the number of protected data assets (columns) categorized by tag name within ALTR. This allows for easy identification of any gaps or areas needing attention. The chart serves as a valuable tool for data governance, enabling stakeholders to ensure that all relevant data assets are adequately secured.

Tables: The chart is created using tables SNOW_DB_INFO and ALTR_TAG_INFO.

DPMD_BarChart3.png

Description: This bar chart displays the number of protected data assets (columns) categorized by tag values within a specific tag name. By providing a clear visualization of the data, users can easily identify which tag values are protected data in ALTR.

Tables: The chart is created using tables SNOW_DB_INFO and ALTR_TAG_INFO.

DPMD_BarChart4.png

Description: This gauge visualizes the number of protected data assets (columns) categorized by both tag name and tag values. It allows users to easily identify the number of columns that are protected in ALTR.

Tables: This chart is created using measure Count of Protected Columns and table SNOW_DB_INFO.

DPMD_Gauge3.png

Here you can see that out of 149 columns there are 98 columns tagged in Snowflake that are protected in ALTR.

Description: This gauge provides a visualization of the number of protected data assets (columns) that are marked as tokenized within ALTR.

Tables: The chart is created using table ALTR_TOKENIZE_INFO.

DPMD_Gauge4.png

Here you can see that out of 3 columns that are connected in ALTR, 1 column is tokenized.

Publish the Dashboard

To publish the dashboard:

  1. Click File in the top left corner and select Save As to save your Power BI file (.pbix).

  2. Click the Home tab and then select Publish.

  3. Select your Power BI workspace and click Select to upload your dashboard to Power BI Service.

To set up the dashboard in the Power BI Service:

  1. Go to Power BI Service and navigate to your workspace.

  2. Click the report you published to open it.

  3. Pin visualizations to a new or existing dashboard for easier access.

The data protection metrics dashboard is now fully set up and ready for analysis. This dashboard effectively highlights sensitive data that lacks adequate security controls or monitoring, serving as a critical tool for identifying gaps in data security. By leveraging this dashboard, organizations can take informed actions to strengthen their data security measures and ensure a robust defense against potential threats.