Metadata not set (bug or user error?) on partitioned asset AssetMaterialization event? #23546
Unanswered
dboeckenhoff asked this question in Q&A
Replies: 1 comment
hey @dboeckenhoff, manually logging an asset materialization event should only be done in ops. All assets automatically log an asset materialization event when they complete, so what's happening is that the automatic materialization event is overriding the one you logged, hence the loss of metadata. Try logging just the metadata instead:

```python
@asset(partitions_def=partitions_def)
def programs(context, program_api: ProgramAPI):
    partition_key = context.partition_key
    programs = program_api.fetch_programs()
    context.log.info(f"Partition key: {partition_key}")
    for program in programs:
        if program["id"] == partition_key:
            context.log.info(f"Materializing program: {program}")
            # Attach metadata to the automatic materialization event
            # instead of logging a separate AssetMaterialization.
            context.add_output_metadata(
                {"test_string": "test", "program": program},
            )
            return program
    context.log.error("No matching program found")
    return None
```

This should work for you.
Hi everyone,

New to Dagster, and I came up (together with your AI scout) with this MWE, which is still missing a lot of features. What I am trying to accomplish is to trigger a graph job execution whenever a partition of my asset is materialized. I use an AssetMaterialization event and its metadata for that (proposed by the ask-ai). The test_asset_sensor does not pass because the metadata of the AssetMaterialization event seems to be dropped before it is passed to the program_asset_sensor: the metadata dict is empty.

I am happy for all kinds of comments and advice (also not related to the question above).
Cheers!