Support for the Dagster step selection DSL: reexecute_pipeline now takes step_selection, which accepts queries like *solid_a.compute++ (i.e., solid_a.compute, all of its ancestors, its immediate descendants, and their immediate descendants). steps_to_execute is deprecated and will be removed in 0.10.0.
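To illustrate the selection semantics (not Dagster's actual implementation), here is a stdlib-only sketch that computes the set a query like *solid_a.compute++ describes over a toy dependency graph; all step names and the helper functions are illustrative:

```python
from collections import defaultdict

# Toy step dependency graph: step -> list of upstream steps.
# Names are illustrative, not from a real pipeline.
deps = {
    "solid_a.compute": ["upstream.compute"],
    "solid_b.compute": ["solid_a.compute"],
    "solid_c.compute": ["solid_b.compute"],
    "solid_d.compute": ["solid_c.compute"],
}

def ancestors(step, deps):
    """All transitive upstream steps of `step` (the `*` prefix)."""
    seen = set()
    stack = list(deps.get(step, []))
    while stack:
        s = stack.pop()
        if s not in seen:
            seen.add(s)
            stack.extend(deps.get(s, []))
    return seen

def descendants(step, deps, depth):
    """Downstream steps of `step`, up to `depth` generations
    (one generation per trailing `+`)."""
    children = defaultdict(list)
    for child, parents in deps.items():
        for p in parents:
            children[p].append(child)
    seen, frontier = set(), {step}
    for _ in range(depth):
        frontier = {c for s in frontier for c in children[s]} - seen
        seen |= frontier
    return seen

def select(step, deps):
    """Approximate `*step++`: the step itself, all of its ancestors,
    and two generations of its descendants."""
    return {step} | ancestors(step, deps) | descendants(step, deps, 2)

# solid_d.compute is a third-generation descendant, so it is excluded.
print(sorted(select("solid_a.compute", deps)))
```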
Community contributions
[dagster-databricks] Improved setup of Databricks environment (Thanks @sd2k!)
Fixed a bug where pipeline-level hooks were not correctly applied to a pipeline subset.
Improved error messages when the execute command can't load a code pointer.
Fixed a bug that prevented serializing Spark intermediates with configured intermediate storages.
Dagit
Enabled subset reexecution via Dagit when part of the pipeline is still running.
Made schedules clickable, linking to the View All page in the schedule section.
Various Dagit UI improvements.
Experimental
[lakehouse] Added CLI command for building and executing a pipeline that updates a given set of assets: house update --module package.module --assets my_asset*
When using the configured API on a solid or composite solid, a new solid name must be provided.
The image used by the K8sScheduler to launch scheduled executions is now specified under the “scheduler” section of the Helm chart (previously under the “pipeline_run” section).
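As a sketch of the corresponding values.yaml move, with the fields under image shown as hypothetical placeholders rather than the chart's exact schema:

```yaml
# Before: image lived under the pipeline_run section.
# After: specify it under the scheduler section.
scheduler:
  image:
    repository: my-org/my-dagster-image  # hypothetical image name
    tag: "0.9.0"                         # hypothetical tag
```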
New
Added an experimental mode that speeds up interactions in Dagit by launching a gRPC server on startup for each repository location in your workspace. To enable it, add the following to your dagster.yaml:
opt_in:
  local_servers: true
Intermediate Storage and System Storage now default to the first provided storage definition when no configuration is provided. Previously, it would be necessary to provide a run config for storage whenever providing custom storage definitions, even if that storage required no run configuration. Now, if the first provided storage definition requires no run configuration, the system will default to using it.
Added a timezone picker to Dagit, and made all timestamps timezone-aware.
Added solid_config to the hook context, which provides access to the config of the corresponding solid.
Hooks can be set directly on PipelineDefinition or @pipeline, e.g. @pipeline(hook_defs={hook_a}); the hooks will be applied to every solid instance within the pipeline.
Added Partitions tab for partitioned pipelines, with new backfill selector.