
Audit Logging

The following table outlines the expected near-real-time delivery times for ALTR's audit logs:

| Audit Log | Expected Delivery Time |
| --- | --- |
| Query audit | 60 seconds maximum once the query has completed |
| Custom audit | 60 seconds maximum from when the custom audit API is triggered |

ALTR introduces a maximum of 60 seconds of buffering; any additional time is spent preparing or delivering the audit. This buffering period is overridden if the audit size exceeds the 64 MB maximum.

ALTR monitors audit log delivery, and if the expected delivery time exceeds 5 minutes, our Cloud Operations team is alerted.

ALTR's Query Audit Logs maintain a record of all access to sensitive data protected by ALTR.

Whenever a user or application runs a SELECT query against a column or tag connected to ALTR, ALTR automatically generates a rich event log that records who ran the query, what sensitive data was accessed, how much sensitive data was accessed, and which ALTR access policies affected the query results. The specific information included in the log depends on the type of query as well as the type of data source where the query was executed.

Query Audit Logs are available in near-real-time, typically seconds or minutes after a query is complete and all required information regarding that query is available in the connected data source.

Caution

For Snowflake Data Sources, ALTR's Service User must have the MONITOR grant on the warehouse used to execute the query; otherwise, a Query Audit Log will not be generated.
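If the grant is missing, a Snowflake administrator can add it. A minimal sketch, in which the warehouse name REPORTING_WH and the role name ALTR_SERVICE_ROLE are placeholders for your environment (check which role your ALTR Service User actually runs under):

```sql
-- Placeholder names: substitute your warehouse and the role used by ALTR's Service User.
GRANT MONITOR ON WAREHOUSE REPORTING_WH TO ROLE ALTR_SERVICE_ROLE;
```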

To view Query Audit Logs, select Audit Logs > Query Log in the Navigation menu, or view them in ALTR's Amazon S3 Log Export.

System Audit Logs enable ALTR administrators to track major administrative actions taken in the ALTR platform.

Whenever an ALTR administrator performs a key action in the ALTR platform, an audit log is generated that includes information on which user took the action, what action was taken, and which ALTR attributes were changed. The following events are included in ALTR System Audit Logs:

  • Connect, disconnect, and modify data sources ("Databases")

  • Connect, disconnect, and modify columns ("Data")

  • Create, remove, and update Column Access Policies ("Locks")

  • Create, remove, and update Thresholds

  • Alerts being triggered by query activity against Thresholds

  • Resolve and edit Alerts

  • Create, remove, and update ALTR administrators

  • Create, remove, and update API keys

  • Create, remove, and update Applications

  • Create, remove, and update User Groups

System Audit Logs are accessible via ALTR's Amazon S3 Audit Log Export. To enable the export of System Audit Logs, contact ALTR Support.

System audit logs are not currently available in ALTR's UI.

Custom Audit Logs enable ALTR customers to extend ALTR's logging to include custom events, enabling bespoke logging and reporting use cases. For instance, custom audit logs can be used to track login events for a particular data source.

How to Configure and Use Custom Audit Logs

Custom audit logs are triggered by sending event information from a Snowflake User-Defined Function to a Snowflake External Function. See ALTR's Snowflake Integration documentation for more information on how ALTR integrates with Snowflake. These functions are created automatically for newly connected Snowflake accounts. If your Snowflake account was connected to ALTR prior to January 2024, please contact support@altr.com for instructions to configure custom audit logs.

Events can be submitted to ALTR manually through SQL or automatically through processes such as Snowflake tasks. Submitting events to ALTR involves calling the user-defined function with a batch of records. ALTR requires certain information for each event, including:

  • Event Type (This is customer-defined and is used to distinguish between different kinds of events).

  • Content Type (ALTR currently supports ‘application/json’).

  • Event Details (This is a JSON object describing the event. It can include any information relevant to a customer’s use case).

Optionally, customers can send an event time, expressed as milliseconds since the Unix epoch. If an event time is not specified, ALTR uses the time the event was submitted.

There is a 256 KB limit on the size of a custom audit submission. If you have more than 256 KB of data to send, break it into smaller requests.
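As a sketch of a manual submission through SQL (the schema and function name placeholders stand for the objects ALTR created automatically in your account, and the event payload itself is illustrative), a single-event batch could look like:

```sql
-- {YOUR_SCHEMA_NAME} and {YOUR_FUNCTION_NAME} are placeholders for the objects
-- ALTR created automatically when your Snowflake account was connected.
SELECT "ALTR_DSAAS_DB"."{YOUR_SCHEMA_NAME}"."{YOUR_FUNCTION_NAME}"(
  SELECT ARRAY_CONSTRUCT(
    TO_JSON(OBJECT_CONSTRUCT_KEEP_NULL(
      'event_type',    'EXAMPLE_EVENT',                                      -- customer-defined
      'content_type',  'application/json',                                   -- currently the only supported type
      'event_time',    DATE_PART('EPOCH_MILLISECOND', CURRENT_TIMESTAMP()),  -- optional, ms since epoch
      'event_details', TO_JSON(OBJECT_CONSTRUCT('note', 'illustrative event payload'))
    ))
  )
);
```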

Accessing Custom Audit Logs

Custom Audit Logs are accessible via ALTR's Amazon S3 Log Export. To enable exporting of custom audit logs to an S3 bucket, contact ALTR Support.

ALTR indexes events in S3 based on the event time. This is typically the customer-supplied time, if available. If a customer-supplied time is not provided or the customer-supplied time is more than 3 days in the past, events are indexed based on when they were submitted to ALTR.

Example SQL for Submitting Custom Audit Logs

Below is example SQL for a Snowflake Task that records events from Snowflake’s Login History every hour. See Snowflake’s documentation for more information on using tasks.

CREATE OR REPLACE TASK LOGIN_AUDIT_TASK
  SCHEDULE = '60 MINUTE'
AS
  SELECT "ALTR_DSAAS_DB"."{YOUR_SCHEMA_NAME}"."{YOUR_FUNCTION_NAME}"(
    SELECT ARRAY_AGG("event_details") WITHIN GROUP (ORDER BY "event_details" ASC) AS "audit_array"
    FROM (
      SELECT TO_JSON(OBJECT_CONSTRUCT_KEEP_NULL(
        'event_details', "custom_audit_blob",
        'event_time', "event_time",
        'content_type', "content_type",
        'event_type', "event_type")) AS "event_details"
      FROM (
        SELECT TO_VARIANT(OBJECT_CONSTRUCT_KEEP_NULL(
          'event_type', event_type,
          'user_name', user_name,
          'client_ip', client_ip,
          'first_authentication_factor', first_authentication_factor,
          'second_authentication_factor', second_authentication_factor,
          'is_success', is_success,
          'error_code', error_code,
          'error_message', error_message,
          'event_time', event_timestamp
        )) AS "custom_audit_blob",
        DATE_PART('EPOCH_MILLISECOND', audit_table.event_timestamp) AS "event_time",
        'LOGIN' AS "event_type",
        'application/json' AS "content_type"
        FROM TABLE(information_schema.login_history(TIME_RANGE_START => DATEADD('hours', -1, CURRENT_TIMESTAMP()), CURRENT_TIMESTAMP())) audit_table
        ORDER BY event_timestamp DESC)));

-- The query below starts the automated task
ALTER TASK LOGIN_AUDIT_TASK RESUME;

Replace the schema and function name placeholders with the schema and function automatically created by ALTR in your Snowflake account.

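The exact schema and function names vary by account. One way to locate them (a sketch using Snowflake's standard SHOW commands; inspect the schema_name and name columns of the results) is:

```sql
-- List the functions in ALTR's database; the custom audit function's
-- schema_name and name are what the task above needs.
SHOW USER FUNCTIONS IN DATABASE "ALTR_DSAAS_DB";
SHOW EXTERNAL FUNCTIONS;
```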