Looker

Requirements

warning

A Warehouse-type integration must already be configured before the first ingestion of this integration can run.

To get started with Looker in Catalog, you will need:

  • Looker admin credentials
  • An admin Looker API key (follow the steps in "Create a Looker API Key" below)
  • Read-only access to your Looker git repository (follow the steps in "Provide access to your LookML" below)

Catalog managed

To set up a Catalog-managed sync, please:

  • Input your credentials directly in the App.
  • Give us read-only access to your Looker git repository. See how to provide access to your LookML code here.

Your first sync can take up to 48 hours; we will let you know when it is complete ✅

If you are not comfortable giving us access to your credentials, please continue to #client-managed 👇

Client managed

Doing a one-shot extract

For your trial, you can simply give us a one-shot extract of your BI tool's metadata.

To get things working quickly, here's a Google Colab notebook that runs our package.

Running the Extraction package

Install the PyPI package

pip install "castor-extractor[looker]"
info

For further details, see the castor-extractor package page on PyPI.

Running the PyPI package

Once the package has been installed, you should be able to run the following command in your terminal:

castor-extract-looker [arguments]

The script will run and display logs like the following:

INFO - Extracting users from Looker API
INFO - POST(https://cloud.looker.com/api/4.0/login)
INFO - GET(https://cloud.looker.com/api/4.0/users/search)
INFO - Fetched page 1 / 7 results
INFO - GET(https://catalog.cloud.looker.com/api/4.0/users/search)
INFO - Fetched page 2 / 0 results


...

INFO - Wrote output file: /tmp/catalog/1649079699-projects.json
INFO - Wrote output file: /tmp/catalog/1649079699-summary.json

Credentials

  • -c, --client-id: API Key Client ID (mandatory)
  • -s, --client-secret: API Key Client Secret (mandatory)

Other arguments

  • -b, --base-url: Looker base url (mandatory)
  • -o, --output: Target folder to store the extracted files (mandatory)
  • -t, --timeout: Timeout (in seconds) for Looker API calls
  • --log-to-stdout: Writes all log output to stdout instead of stderr
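Putting the mandatory arguments together, an invocation could look like the following sketch (the client ID, client secret, and base URL below are placeholder values to replace with your own):

```shell
# Extract Looker metadata into /tmp/catalog (placeholder credentials).
castor-extract-looker \
  --client-id "YOUR_CLIENT_ID" \
  --client-secret "YOUR_CLIENT_SECRET" \
  --base-url "https://yourcompany.cloud.looker.com" \
  --output /tmp/catalog \
  --timeout 120
```

The output folder must exist before the run; the extracted JSON files are written there.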

Specific export methods for looks and dashboards

  • --search-per-folder: Exports looks and dashboards per folder using multithreading (see the argument below)
  • --thread-pool-size: Number of parallel threads (defaults to 20)
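With the folder-based export enabled, the call could look like this sketch (credentials and URL are again placeholders; the thread-pool size of 10 is an arbitrary example overriding the default of 20):

```shell
# Export looks and dashboards per folder, with 10 parallel threads.
castor-extract-looker \
  --client-id "YOUR_CLIENT_ID" \
  --client-secret "YOUR_CLIENT_SECRET" \
  --base-url "https://yourcompany.cloud.looker.com" \
  --output /tmp/catalog \
  --search-per-folder \
  --thread-pool-size 10
```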
info

You can also display the full usage with the --help argument.

Scheduling and Push to Catalog

When moving out of trial, you'll want to refresh your Looker content in Catalog. Here is how to do it:

The Catalog team will provide you with:

  1. A Catalog Identifier (an ID for us to match your Looker files with your Catalog instance)
  2. A Catalog Token (an API token)

You can then use the castor-upload command:

castor-upload [arguments]

Arguments

  • -k, --token: Token provided by Catalog
  • -s, --source_id: account id provided by Catalog
  • -t, --file_type: source type to upload

Target files

To specify the target files, provide one of the following:

  • -f, --file_path: to push a single file

or

  • -d, --directory_path: to push several files at once (*)
warning

(*) The tool will upload all files included in the given directory.

Make sure it contains only the extracted files before pushing.
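As a sketch, pushing a whole extraction directory could look like this (the token, source ID, and file-type value are placeholders; check --help for the accepted file types):

```shell
# Upload every extracted file in /tmp/catalog to Catalog (placeholder values).
castor-upload \
  --token "YOUR_CATALOG_TOKEN" \
  --source_id "YOUR_SOURCE_ID" \
  --file_type looker \
  --directory_path /tmp/catalog
```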

Finally, schedule the extraction run and the push to Catalog; use your preferred scheduler to create this job.
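Any scheduler works. As one hedged example using cron, a daily job could chain the extraction and the upload (every value below is a placeholder, and credentials are assumed to be exported as environment variables):

```shell
# Placeholder crontab entry: extract then upload every day at 02:00.
0 2 * * * castor-extract-looker -c "$LOOKER_CLIENT_ID" -s "$LOOKER_CLIENT_SECRET" -b "https://yourcompany.cloud.looker.com" -o /tmp/catalog && castor-upload -k "$CATALOG_TOKEN" -s "$CATALOG_SOURCE_ID" -t looker -d /tmp/catalog
```

Chaining with && ensures the upload only runs if the extraction succeeded.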

You're done!