Tableau

Integrate Tableau with the Catalog to sync dashboards, data sources, and metadata.

Requirements

Warehouse Required

A Warehouse-type integration must already be configured before the first ingestion of this integration can run.

For all Tableau clients:

  • Tableau admin credentials

For Tableau Server clients only:

Tableau Server Version

Clients using a Tableau Server version earlier than 2019.3 will not have table-level lineage available.

On-Premise Support

For Tableau on-premises deployments, both Catalog-managed and client-managed integrations are supported.

MFA and PAT

If you have MFA enabled for your account, you will need to use the Tableau personal access token (PAT) option.

Whitelist Catalog IP

Here are our fixed IPs:

Self-Hosted Access

Self-hosted repositories must be accessible from our IP over the public internet for Catalog managed integrations.

Catalog Managed

You can upload your credentials directly in the App when creating your Tableau integration.

We need a Tableau personal access token (PAT):

{
  "serverUrl": "https://something.tableau.com/",
  "siteId": "something",
  "tokenName": "catalog",
  "token": "abcdefgh"
}

  • serverUrl: Tableau base URL, your API endpoint, usually your Tableau URL homepage. For example: <https://eu-west-1a.online.tableau.com>
  • siteId: the Tableau Server site you are authenticating with. For example, in the site URL http://MyServer/#/site/MarketingTeam/projects, the site name is MarketingTeam.
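Before uploading the credentials, it can help to sanity-check the file locally. A minimal sketch, assuming only the four key names shown in the sample above (the helper itself is ours, not part of the Catalog tooling):

```python
import json

# Keys the Catalog expects, per the sample credentials above.
REQUIRED_KEYS = {"serverUrl", "siteId", "tokenName", "token"}

def validate_credentials(raw: str) -> dict:
    """Parse the credentials JSON and fail early on missing keys."""
    creds = json.loads(raw)
    missing = REQUIRED_KEYS - creds.keys()
    if missing:
        raise ValueError(f"missing credential keys: {sorted(missing)}")
    if not creds["serverUrl"].startswith("https://"):
        raise ValueError("serverUrl should be an https:// URL")
    return creds

sample = """
{
  "serverUrl": "https://something.tableau.com/",
  "siteId": "something",
  "tokenName": "catalog",
  "token": "abcdefgh"
}
"""
creds = validate_credentials(sample)
```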

Default Site

This value can be omitted to connect with the Default site on the server. Use the following format to input your credentials:

{
  "serverUrl": "https://something.tableau.com/",
  "siteId": "Default",
  "tokenName": "catalog",
  "token": "abcdefgh"
}

Your first sync can take up to 48 hours; we will let you know when it is complete.

If you are not comfortable giving us access to your credentials, please continue to Client managed below.

Client Managed

Doing a One Shot Extract

For your trial, you can simply give us a one-shot extract of your BI tool.

To get things working quickly, here's a Google Colab to run our package.

Running the Extraction Package

Install the PyPI Package

pip install castor-extractor[tableau]

For further details, see the castor-extractor PyPI page.

Run the PyPI Package

Once the package has been installed, you should be able to run the following command in your terminal:

castor-extract-tableau [arguments]

The script will run and display logs as follows:

INFO - Logging in using user and password authentication
INFO - Signed into https://eu-west-1a.online.tableau.com as user with id ****
INFO - Extracting USER from Tableau API
INFO - Fetching USER
INFO - Querying all users on site

...

INFO - Wrote output file: /tmp/catalog/1649078755-custom_sql_queries.json
INFO - Wrote output file: /tmp/catalog/1649078755-summary.json
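As the log lines above suggest, the extracted files follow a `<timestamp>-<asset>.json` naming pattern. A small helper of our own (not part of castor-extractor) that splits a file name back into those parts:

```python
from pathlib import PurePath

def parse_output_name(path: str) -> tuple[str, str]:
    """Split an extracted file name into (timestamp, asset) parts.

    Assumes the '<timestamp>-<asset>.json' pattern seen in the logs.
    """
    stem = PurePath(path).stem          # e.g. '1649078755-summary'
    timestamp, _, asset = stem.partition("-")
    return timestamp, asset

print(parse_output_name("/tmp/catalog/1649078755-summary.json"))
# -> ('1649078755', 'summary')
```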

Credentials

You can sign in using the following method:

  • Tableau personal access token (PAT)
    • -n, --token-name: Tableau token name
    • -t, --token: Tableau token

Other Arguments

  • -b, --server-url: Tableau base URL, your API endpoint, usually your Tableau URL homepage. For example: <https://eu-west-1a.online.tableau.com>
  • -i, --site-id: Tableau Site ID; leave it empty if your site is the default one
  • -o, --output: target folder to store the extracted files
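Putting the arguments together, the extraction can also be scripted. A minimal sketch that builds the command line from the flags documented above (the server URL, site, and token values are placeholders):

```python
import subprocess

# Placeholder values -- substitute your own PAT and site details.
cmd = [
    "castor-extract-tableau",
    "--server-url", "https://eu-west-1a.online.tableau.com",
    "--site-id", "MarketingTeam",
    "--token-name", "catalog",
    "--token", "abcdefgh",
    "--output", "/tmp/catalog",
]
# subprocess.run(cmd, check=True)  # uncomment to run the extraction
print(" ".join(cmd))
```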
Help Argument

You can also get help with the --help argument.

Scheduling and Push to Catalog

When moving out of trial, you'll want to refresh your Tableau content in the Catalog regularly. The Catalog team will provide you with:

  1. Catalog Identifier: an id for us to match your Tableau files with your Catalog instance, referred to as source_id in the code examples
  2. Catalog Token: an API token

You can then use the castor-upload command:

castor-upload [arguments]

Arguments

  • -k, --token: Token provided by Catalog
  • -s, --source_id: account id provided by Catalog
  • -t, --file_type: source type to upload. Currently supported are 0

Target Files

To specify the target files, provide one of the following:

  • -f, --file_path: to push a single file
  • -d, --directory_path: to push several files at once
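A minimal upload invocation built from the flags above (the token and source id are placeholders for the values the Catalog team provides):

```python
import subprocess

# Placeholder credentials -- the Catalog team provides the real values.
upload_cmd = [
    "castor-upload",
    "--token", "my-api-token",
    "--source_id", "my-source-id",
    "--file_type", "0",
    "--directory_path", "/tmp/catalog",
]
# subprocess.run(upload_cmd, check=True)  # uncomment to push the files
```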
Directory Contents

The tool will upload all files included in the given directory.

Make sure it contains only the extracted files before pushing.

Finally, schedule the extraction run and the push to the Catalog. Use your preferred scheduler to create this job.
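The two steps can be chained into a single job for your scheduler (cron, Airflow, or similar). A sketch, assuming the flags documented earlier and placeholder credentials throughout:

```python
import subprocess

def refresh_tableau_catalog(output_dir: str = "/tmp/catalog") -> None:
    """Extract Tableau metadata, then push the files to the Catalog."""
    # Step 1: extract from Tableau (placeholder server and PAT values).
    subprocess.run(
        ["castor-extract-tableau",
         "--server-url", "https://something.tableau.com",
         "--token-name", "catalog", "--token", "abcdefgh",
         "--output", output_dir],
        check=True,
    )
    # Step 2: push the extracted files (placeholder Catalog credentials).
    subprocess.run(
        ["castor-upload",
         "--token", "my-api-token", "--source_id", "my-source-id",
         "--file_type", "0", "--directory_path", output_dir],
        check=True,
    )
```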