
Integrating Databricks

Capabilities

DataGrail's Databricks integration provides the following capabilities:

  • Request Manager
    • Request Types: Access, Deletion, Do Not Sell/Share, Identifiers
    • Identifier Categories: Any
  • Live Data Map
    • Data Discovery

Before You Start

To successfully configure this integration, please ensure you have sufficient privileges:

  • DataGrail User Role: Super Admin, Connections Manager
  • Databricks User Role: Admin
  • Secrets Manager: Write Access

Connecting with RM Agent

The Request Manager Agent allows you to automate Data Subject Requests by connecting to internal systems within your network, without requiring ingress from the public network.

The Agent connects to your Databricks instance using least privileged credentials that you will create and store in a secrets manager. When configuring the Databricks integration in DataGrail, only the location of that secrets manager entry will be referenced (e.g., AWS Secrets Manager ARN), which ensures that no secrets are shared directly with DataGrail.

Prerequisites

Before you can connect to Databricks, ensure the following:

  • RM Agent is deployed and connected in DataGrail.
  • Network is configured to allow the Agent to connect with the Databricks instance.
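The network requirement above can be spot-checked from the Agent's host before anything is configured in DataGrail. A minimal sketch using only the Python standard library; the port (443, the default for Databricks HTTPS endpoints) and any hostname you pass in are assumptions to adapt to your environment:

```python
import socket

def databricks_reachable(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to the Databricks host succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failures, connection refusals, and timeouts
        return False
```

If this returns False for your workspace hostname, revisit firewall and egress rules before continuing.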

Create and Store Credentials

  1. In Databricks, create a new user for the agent, granting only the minimum permissions necessary to execute requests.

  2. Configure the following JSON key-value pairs:

    {
      "host": "<Databricks instance host name>",
      "http_path": "<HTTP path to DBSQL endpoint (e.g. /sql/1.0/endpoints/...) or to a DBR interactive cluster (e.g. /sql/protocolv1/o/...)>",
      "access_token": "<HTTP Bearer access token, e.g. Databricks Personal Access Token>"
    }
  3. Store the JSON value in your secrets manager with an entry name like datagrail.rm-agent.databricks.

  4. Ensure that the agent is configured to retrieve the value of this secrets manager entry.
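Before storing the entry, it can help to sanity-check that the JSON carries exactly the fields the agent expects. A minimal sketch; the required key names match the JSON above, while the validation helper itself is illustrative:

```python
import json

# The three fields the agent reads from the secrets manager entry.
REQUIRED_KEYS = {"host", "http_path", "access_token"}

def validate_credentials(secret_json: str) -> dict:
    """Parse the secrets manager entry and confirm the required fields exist."""
    creds = json.loads(secret_json)
    missing = REQUIRED_KEYS - creds.keys()
    if missing:
        raise ValueError(f"credentials entry is missing fields: {sorted(missing)}")
    return creds
```

Running this against the JSON value before storing it catches typos in key names early, rather than at the agent's first connection attempt.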

Add the Agent Integration

  1. In DataGrail, navigate to Agents and select your Agent.
  2. In the top right, select Add New Integration and search for Databricks.
  3. Under Enabled Capabilities and Enabled Identifiers, select only those that will be used for this integration.
  4. Enter the Credentials Location (e.g. AWS Secrets Manager ARN).
  5. Select the Data Retrieval behavior for deletion requests.
    warning

    When using Retrieve Data, the data reviewed may not exactly match the data deleted, because the access and deletion logic execute separately.

  6. Under Agent Query Configuration, add request logic to be executed within Databricks for all enabled request types.
    Query Parameter Format

    Use the named paramstyle when formatting query parameters with identifiers (e.g., email, user_id).

    Example:

    SELECT * FROM users WHERE email = :email
  7. Finally, select Configure Integration. Wait a few moments to ensure that the connection is successful. For failed connections, review the Agent container logs for additional details.
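The named paramstyle above applies to every enabled request type. A minimal sketch of what configured queries and agent-side execution might look like, assuming the databricks-sql-connector package; the users table and email identifier are illustrative, not part of the integration itself:

```python
# Illustrative query templates, one per enabled request type; each uses the
# named paramstyle (:email) rather than positional placeholders.
QUERIES = {
    "access": "SELECT * FROM users WHERE email = :email",
    "deletion": "DELETE FROM users WHERE email = :email",
}

def run_request(creds: dict, request_type: str, email: str):
    """Execute the configured query for one request type against Databricks."""
    from databricks import sql  # requires the databricks-sql-connector package

    with sql.connect(
        server_hostname=creds["host"],
        http_path=creds["http_path"],
        access_token=creds["access_token"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(QUERIES[request_type], {"email": email})
            # Access requests return rows for review; deletions return nothing.
            return cur.fetchall() if request_type == "access" else None
```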

Connecting with RDD Agent

The Responsible Data Discovery Agent allows you to securely perform data classification by connecting to internal systems within your network, without requiring ingress from the public network.

For the Agent to scan your Databricks instance, read-only credentials are created and stored in a vault on your network. When configuring the Databricks integration in DataGrail, only the location of that vault entry will be referenced (e.g., AWS Secrets Manager ARN), which ensures that no secrets are shared directly with DataGrail.

Before Connecting

To start scanning Databricks, ensure the following:

  • RDD Agent is deployed and connected in DataGrail.
  • Network is configured to allow the Agent to connect with the Databricks instance.

Create and Store Credentials

  1. In Databricks, create a new read-only user (e.g., datagrail-rdd-agent). Consult your preferred Databricks documentation as needed.

  2. Configure the following JSON key-value pairs:

    {
      "host": "<Databricks instance host name>",
      "http_path": "<HTTP path to DBSQL endpoint (e.g. /sql/1.0/endpoints/...) or to a DBR interactive cluster (e.g. /sql/protocolv1/o/...)>",
      "access_token": "<HTTP Bearer access token, e.g. Databricks Personal Access Token>"
    }
  3. Store the JSON value in your vault with an entry name like datagrail-rdd-agent-databricks.

  4. Ensure that the agent is configured to retrieve the value of this vault entry.
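Once the vault entry is in place, a quick way to confirm the read-only credentials work end to end is to run a trivial query through the same connector the agent uses. A minimal sketch, assuming the databricks-sql-connector package; the helper names are illustrative:

```python
import json

def connect_args(secret_json: str) -> dict:
    """Map the vault entry's JSON fields onto databricks-sql-connector arguments."""
    creds = json.loads(secret_json)
    return {
        "server_hostname": creds["host"],
        "http_path": creds["http_path"],
        "access_token": creds["access_token"],
    }

def smoke_test(secret_json: str) -> bool:
    """Return True if the read-only user can execute a trivial query."""
    from databricks import sql  # requires the databricks-sql-connector package

    with sql.connect(**connect_args(secret_json)) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT 1")
            return cur.fetchone() is not None
```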

Add the Agent Integration

  1. In DataGrail, navigate to Agents under Integration network.
  2. Select your Agent.
  3. In the top right, select Add New Integration.
  4. Search for Databricks, then select Configure.
  5. Enter an Integration Name, and only enable the Data Discovery capability.
  6. Enter the Credentials Location (e.g. AWS Secrets Manager ARN).
  7. (optional) Choose the Business Processes, Region, and System Location.
  8. Finally, select Configure Integration. Wait a few moments to ensure that the connection is successful. For failed connections, review the Agent container logs for additional details.

Troubleshooting

If you are unable to successfully connect the integration, review these common troubleshooting steps:

Agent Unable to Connect to Databricks
  1. Verify that the network is configured to allow the Agent to connect with the Databricks instance.
  2. Verify the Agent has permissions to access the Databricks credentials stored in your vault.
Agent is Not Connected in DataGrail
  1. Confirm that the Agent is running and its logs do not indicate any errors.
  2. Verify that the DataGrail API Key used by the Agent is valid and not expired.
  3. Verify that the Agent has permissions to access the DataGrail API Key stored in your vault.
  4. Verify that network egress is permitted from the Agent to your DataGrail domain.

Technical Details

  • Access Type: Synchronous
  • Deletion Type: Synchronous (Whole Record)
  • Opt Out Type: Synchronous

 

Need help?
If you have any questions, please reach out to your dedicated Account Manager or contact us at support@datagrail.io.

Disclaimer: The information contained in this message does not constitute legal advice. We would advise seeking professional counsel before acting on or interpreting any material.