
v0.3.13

info

This contains detailed release notes, but there's also an announcement blog post that covers the highlights.

Release Availability Date

31-Jul-2025

  • CLI/SDK: 1.2.0.1
  • Remote Executor: v0.3.13-acryl (recommended), v0.3.12.4-acryl, v0.3.11.1-acryl
  • On-Prem Versions:
    • Helm: 1.5.68
    • API Gateway: 0.5.3
    • Actions: 0.0.13

Release Changelog

This release includes all changes up to and including DataHub Core v1.2.0.

  • Breaking Changes

    • All DataHub Python packages now require Python 3.9+. This affects the following packages:
      • acryl-datahub (DataHub CLI and SDK)
      • acryl-datahub-actions
      • acryl-datahub-airflow-plugin
      • acryl-datahub-prefect-plugin
      • acryl-datahub-gx-plugin
      • acryl-datahub-dagster-plugin (already required Python 3.9+)
    • #13619: The acryl-datahub-airflow-plugin has dropped support for Airflow versions less than 2.7.
    • #14054: The v1 plugin in acryl-datahub-airflow-plugin has been removed. The v2 plugin has been the default for a while already, so this should not impact most users. Users who were explicitly setting DATAHUB_AIRFLOW_PLUGIN_USE_V1_PLUGIN=true will need to either upgrade or pin to an older version to continue using the v1 plugin.
    • #14015: In the sql-queries source, the default_dialect configuration parameter has been renamed to override_dialect. This also affects the Python SDK methods (see the sketch after this list):
      • DataHubGraph.parse_sql_lineage(default_dialect=...) → DataHubGraph.parse_sql_lineage(override_dialect=...)
      • LineageClient.add_lineage_via_sql(default_dialect=...) → LineageClient.add_lineage_via_sql(override_dialect=...)
    • #14059: The acryl-datahub-gx-plugin now requires pydantic v2, which means the effective minimum supported version of GX is 0.17.15 (from Sept 2023).
    • #13601: The use_queries_v2 flag is now enabled by default for Snowflake and BigQuery ingestion. This improves the quality of lineage and quantity of queries extracted.
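
For the #14015 rename, the change on the SDK side is a keyword-argument swap. Below is a minimal sketch, assuming a DataHub instance reachable at http://localhost:8080 and an illustrative SQL query; exact client construction and import paths may differ slightly by SDK version.

```python
from datahub.ingestion.graph.client import DatahubClientConfig, DataHubGraph

# Assumes a DataHub instance is reachable at this address; adjust server/token for your setup.
graph = DataHubGraph(DatahubClientConfig(server="http://localhost:8080"))

# Illustrative query only.
query = "INSERT INTO analytics.daily_users SELECT user_id, event_ts FROM raw.events"

# Before this release:
#   graph.parse_sql_lineage(query, platform="snowflake", default_dialect="snowflake")
# As of CLI/SDK 1.2.x, the parameter is named override_dialect:
result = graph.parse_sql_lineage(query, platform="snowflake", override_dialect="snowflake")
print(result.in_tables, result.out_tables)
```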
  • Product

    • DataHub AI in Slack: The AI-powered @DataHub Slack command is now available in public beta. Admins can enable this feature by navigating to UI → Settings → AI.
    • Customizable Home Page: Introducing a brand new home page experience with customization to suit your personal or organizational needs! Currently in private beta behind a feature flag, this new home page allows users with the appropriate permissions to create or edit modules that define a custom default experience for all users in your organization. Alternatively, users can update their own personal home page to suit their needs. Configure custom asset collections, hierarchy views, documentation, pinned links, and more!
    • Entity Profile Design Updates: Entity profile pages receive a design uplift in this release, with sleeker, simpler tabs that bring a more consistent feel to the page.
    • Access Workflows: Introducing support for creating access approval workflows with custom entry points, custom form fields, routing policies, and more using the upsertActionWorkflow GraphQL API. This release also introduces support for creating & reviewing access workflows via the Task Center. This is currently in private beta, available behind a feature flag (ACTION_WORKFLOWS_ENABLED).
    • Bulk Create Field Metric Smart Assertions: When creating a field metric assertion, you now have the ability to 'Bulk create smart assertions'. This allows you to select multiple fields and metrics, and spin up anomaly monitors across all of them in one go.
    • Bulk Create Freshness and Volume Smart Assertions: On the Data Health page you can now create smart freshness and volume assertions across thousands of tables in one go, making it effortless to put anomaly monitors in place across your entire data landscape.
    • Improved Notifications for Assertion Failures: Slack and email alerts for assertion failures now include context on expected vs. actual values, making it easier to separate signal from noise right where you work.
    • Assertion Notes: Add notes to assertions, capturing troubleshooting tips and other critical context for data producers who are responsible for maintaining the quality of the checks.
    • Floors and Ceilings for Metric Predictions: Smart assertions on metrics like volume and null percentage now have ceilings and floors to accurately capture the absolute limits of the metrics.
    • Preview Exclusion Windows in the Assertions Timeline: The historical assertion timeline visualization now displays exclusion windows that were applied to filter out bad training data.
    • Container Filters on the Data Health Dashboard: Filter your Data Health dashboard by the asset's container, making it easy to see the health of specific schemas in your database.
    • Data Health Filters reflected in URL: This makes it easy to bookmark and share links to specific filtered pages on the Data Health dashboard.
    • MCP Server: The search tool has been revamped to improve LLM understanding and reduce tool confusion and tool-call errors by ~60%.
    • AI-Generated Documentation: We can now generate docs for tables with up to 3,000 columns, up from the previous limit of 1,000.
  • Ingestion

    • For Snowflake and BigQuery, query extraction v2 has been enabled by default. This feature has been validated in beta for 6+ months, and improves the quality of lineage and usage, the quantity of queries extracted, and overall ingestion performance. See the sketch below for setting the flag explicitly.
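
If you need to temporarily opt out of the new default while you validate the results, the use_queries_v2 flag can still be set explicitly in the source config. Below is a minimal sketch using the Python ingestion Pipeline API with placeholder Snowflake credentials and a local datahub-rest sink; YAML recipes take the same key under source.config.

```python
from datahub.ingestion.run.pipeline import Pipeline

# Placeholder connection values; substitute your own account, credentials, and sink.
pipeline = Pipeline.create(
    {
        "source": {
            "type": "snowflake",
            "config": {
                "account_id": "my_account",
                "username": "datahub_reader",
                "password": "********",
                "warehouse": "COMPUTE_WH",
                # Enabled by default as of this release; set to False to fall back
                # to the previous query extraction behavior while validating.
                "use_queries_v2": False,
            },
        },
        "sink": {
            "type": "datahub-rest",
            "config": {"server": "http://localhost:8080"},
        },
    }
)
pipeline.run()
pipeline.raise_from_status()
```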
  • Platform

  • Bug Fixes

    • Search Filters - Allow searching for values on structured property filters outside of the initial set of provided values from the server. The limitation here is that your search must include the full value you are looking for in order to filter on it.
    • Freshness smart assertions have been updated to look at when operations were actually applied on the source data. Previously they used the timestamp when the operation metadata was ingested into DataHub. This is a critical fix for Freshness Smart Assertions.
    • The UI to create smart assertions for views now defaults to querying the view for row count. Previously it looked at the warehouse's information schema by default, which always failed since information schemas don't capture metrics for views.

Known Issues