
Preset

Overview

Preset is a business intelligence and analytics platform. Learn more in the official Preset documentation.

The DataHub integration for Preset covers BI entities such as dashboards, charts, datasets, and related ownership context. Depending on module capabilities, it can also capture features such as lineage, usage, profiling, ownership, tags, and stateful deletion detection.

Concept Mapping

A source-specific concept mapping is still pending; the table below shows the generic mapping of source concepts to DataHub concepts.

| Source Concept | DataHub Concept | Notes |
| --- | --- | --- |
| Platform/account/project scope | Platform Instance, Container | Organizes assets within the platform context. |
| Core technical asset (for example table/view/topic/file) | Dataset | Primary ingested technical asset. |
| Schema fields / columns | SchemaField | Included when schema extraction is supported. |
| Ownership and collaboration principals | CorpUser, CorpGroup | Emitted by modules that support ownership and identity metadata. |
| Dependencies and processing relationships | Lineage edges | Available when lineage extraction is supported and enabled. |
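As a rough illustration of how these concepts surface in DataHub, every entity is identified by a URN. The helper below is a hypothetical sketch of the standard dataset URN shape for the preset platform, not the datahub library's actual builder:

```python
# Hypothetical sketch of the DataHub dataset URN shape; the real
# builders live in the datahub library itself.
def make_dataset_urn(platform: str, name: str, env: str = "PROD") -> str:
    """Build a dataset URN of the form used across DataHub."""
    return f"urn:li:dataset:(urn:li:dataPlatform:{platform},{name},{env})"

print(make_dataset_urn("preset", "examples.births"))
# urn:li:dataset:(urn:li:dataPlatform:preset,examples.births,PROD)
```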

Module preset

Certified

Important Capabilities

| Capability | Notes |
| --- | --- |
| Detect Deleted Entities | Enabled by default via stateful ingestion. |
| Domains | Enabled via the domain config, which assigns domains by key. |
| Extract Tags | Supported by default. |
| Table-Level Lineage | Supported by default. |
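For example, domain assignment (the Domains capability above) is driven by the domain config, which maps a domain key to regex patterns for the assets it should cover; the domain URN and pattern below are placeholders:

```yaml
source:
  type: preset
  config:
    domain:
      "urn:li:domain:finance":   # placeholder domain URN
        allow:
          - "finance_.*"         # assets matching this regex are assigned the domain
```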

Overview

The preset module ingests metadata from Preset into DataHub. It is intended for production ingestion workflows; module-specific capabilities are documented below.

Prerequisites

Before running ingestion, ensure network connectivity to the source, valid authentication credentials, and read permissions for metadata APIs required by this module.

Install the Plugin

pip install 'acryl-datahub[preset]'

Starter Recipe

Check out the following recipe to get started with ingestion! See below for full configuration options.

For general pointers on writing and running a recipe, see our main recipe guide.

source:
  type: preset
  config:
    # Coordinates
    connect_uri: Preset workspace URL
    manager_uri: https://api.app.preset.io

    # Credentials
    api_key: API key
    api_secret: API secret
    database_alias:
      example_name_1: business_name_1
      example_name_2: business_name_2

sink:
  # sink configs

Config Details

Note that a . is used to denote nested fields in the YAML recipe.
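A small sketch of what that dot notation means: a dotted field name in the list below corresponds to a nested key in the recipe, as this hypothetical resolver illustrates:

```python
def resolve(config: dict, dotted: str):
    """Walk a nested config dict using a dotted path like 'stateful_ingestion.enabled'."""
    node = config
    for part in dotted.split("."):
        node = node[part]
    return node

# 'stateful_ingestion.enabled' in the field list refers to this nested YAML shape:
config = {"stateful_ingestion": {"enabled": True, "remove_stale_metadata": True}}
print(resolve(config, "stateful_ingestion.enabled"))  # True
```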

  • api_key (string(password) or null): Preset.io API key. Default: None.
  • api_secret (string(password) or null): Preset.io API secret. Default: None.
  • connect_uri (string): Preset workspace URL. Default: (empty).
  • database_alias (map(str, string)): Mapping of database names to business-friendly aliases (see the Database alias section below).
  • display_uri (string or null): Optional URL to use in links, if connect_uri is only for ingestion. Default: None.
  • ingest_charts (boolean): Enable to ingest charts. Default: True.
  • ingest_dashboards (boolean): Enable to ingest dashboards. Default: True.
  • ingest_datasets (boolean): Enable to ingest datasets. Default: False.
  • manager_uri (string): Preset.io API URL.
  • max_threads (integer): Maximum parallelism for API calls. Defaults to the CPU count or 40.
  • options (object): Default: {}.
  • password (string(password) or null): Superset password. Default: None.
  • platform_instance (string or null): The instance of the platform that all assets produced by this recipe belong to. This should be unique within the platform. See https://docs.datahub.com/docs/platform-instances/ for more details. Default: None.
  • provider (string): Superset provider. Default: db.
  • timeout (integer): Timeout of a single API call to Superset. Default: 10.
  • username (string or null): Superset username. Default: None.
  • env (string): Environment to use in the namespace when constructing URNs. Default: PROD.
  • chart_pattern (AllowDenyPattern): Allow/deny regexes used to filter charts.
  • chart_pattern.ignoreCase (boolean or null): Whether to ignore case during pattern matching. Default: True.
  • dashboard_pattern (AllowDenyPattern): Allow/deny regexes used to filter dashboards.
  • dashboard_pattern.ignoreCase (boolean or null): Whether to ignore case during pattern matching. Default: True.
  • database_pattern (AllowDenyPattern): Allow/deny regexes used to filter databases.
  • database_pattern.ignoreCase (boolean or null): Whether to ignore case during pattern matching. Default: True.
  • dataset_pattern (AllowDenyPattern): Allow/deny regexes used to filter datasets.
  • dataset_pattern.ignoreCase (boolean or null): Whether to ignore case during pattern matching. Default: True.
  • domain (map(str, AllowDenyPattern)): Mapping from domain key to allow/deny regexes for the assets assigned to that domain.
  • domain.key.allow (array of string): List of regex patterns to include in ingestion. Default: ['.*'].
  • domain.key.ignoreCase (boolean or null): Whether to ignore case during pattern matching. Default: True.
  • domain.key.deny (array of string): List of regex patterns to exclude from ingestion. Default: [].
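The pattern fields above follow the usual allow/deny semantics. The sketch below approximates how such filters are commonly evaluated (deny takes precedence over allow, and regexes match from the start of the name); it is not the library's exact implementation:

```python
import re

def is_allowed(name, allow=(".*",), deny=(), ignore_case=True):
    """Approximate AllowDenyPattern: a name passes if no deny regex
    matches and at least one allow regex matches (from the start)."""
    flags = re.IGNORECASE if ignore_case else 0
    if any(re.match(p, name, flags) for p in deny):
        return False
    return any(re.match(p, name, flags) for p in allow)

print(is_allowed("sales_dashboard", allow=("sales_.*",), deny=(".*_test",)))       # True
print(is_allowed("sales_dashboard_test", allow=("sales_.*",), deny=(".*_test",)))  # False
```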
  • stateful_ingestion (StatefulStaleMetadataRemovalConfig or null): Preset stateful ingestion config. Default: None.
  • stateful_ingestion.enabled (boolean): Whether to enable stateful ingestion. Defaults to True if a pipeline_name is set and either a datahub-rest sink or datahub_api is specified; otherwise False. Default: False.
  • stateful_ingestion.fail_safe_threshold (number): If the relative change (in percent) in entities compared to the previous state exceeds this threshold, the state is not committed, preventing a large number of soft deletes caused by accidental changes to the source configuration. Default: 75.0.
  • stateful_ingestion.remove_stale_metadata (boolean): Soft-deletes entities that were present in the last successful run but are missing in the current run, when stateful ingestion is enabled. Default: True.
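Putting the stateful settings together, a recipe that enables stale-metadata removal might look like the fragment below (the pipeline_name value is a placeholder; a pipeline_name and a DataHub sink are needed for state tracking):

```yaml
pipeline_name: preset_prod_ingestion   # placeholder; required for state tracking
source:
  type: preset
  config:
    stateful_ingestion:
      enabled: true
      remove_stale_metadata: true
```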

Capabilities

Use the Important Capabilities table above as the source of truth for supported features and whether additional configuration is required.

Database alias

If you have used database_alias in other ingestion sources to rename databases to match business terminology, you can apply the same renames here as well:

source:
  type: preset
  config:
    # Coordinates
    connect_uri: Preset workspace URL
    manager_uri: https://api.app.preset.io

    # Credentials
    api_key: API key
    api_secret: API secret
    database_alias:
      example_name_1: business_name_1
      example_name_2: business_name_2

sink:
  # sink configs
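Conceptually, the alias map is a simple name substitution applied to database names during ingestion; the toy function below (hypothetical, not the source's actual code) shows the effect:

```python
# Toy illustration of database_alias: rename known databases, pass others through.
aliases = {
    "example_name_1": "business_name_1",
    "example_name_2": "business_name_2",
}

def apply_alias(database_name: str) -> str:
    """Return the configured business alias if present, else the original name."""
    return aliases.get(database_name, database_name)

print(apply_alias("example_name_1"))  # business_name_1
print(apply_alias("analytics_db"))    # analytics_db (unchanged)
```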

Limitations

Module behavior is constrained by source APIs, permissions, and metadata exposed by the platform. Refer to capability notes for unsupported or conditional features.

Troubleshooting

If ingestion fails, validate credentials, permissions, connectivity, and scope filters first. Then review ingestion logs for source-specific errors and adjust configuration accordingly.

Code Coordinates

  • Class Name: datahub.ingestion.source.preset.PresetSource
  • Browse on GitHub
Questions?

If you've got any questions on configuring ingestion for Preset, feel free to ping us on our Slack.

💡 Contributing to this documentation

This page is auto-generated from the underlying source code. To make changes, please edit the relevant source files in the metadata-ingestion directory.

Tip: For quick typo fixes or documentation updates, you can click the ✏️ Edit icon directly in the GitHub UI to open a Pull Request. For larger changes and PR naming conventions, please refer to our Contributing Guide.