prompty: expose the 'model' property for use in flex flows
Useful in flex flows, for instance to adapt the external logic to some conditions
(i.e. whether the api is chat or completion, whether it is streaming or not, etc.).
ianchi committed Oct 19, 2024
1 parent f08e576 commit 9c963e4
Showing 2 changed files with 9 additions and 0 deletions.
2 changes: 2 additions & 0 deletions src/promptflow-core/promptflow/core/__init__.py
@@ -12,6 +12,7 @@
     AzureOpenAIModelConfiguration,
     ModelConfiguration,
     OpenAIModelConfiguration,
+    PromptyModelConfiguration
 )

 from ._version import __version__
@@ -29,6 +30,7 @@
     "AsyncPrompty",
     "ModelConfiguration",
     "OpenAIModelConfiguration",
+    "PromptyModelConfiguration",
     "AzureOpenAIModelConfiguration",
     "__version__",
 ]
7 changes: 7 additions & 0 deletions src/promptflow-core/promptflow/core/_flow.py
@@ -377,6 +377,13 @@ def _parse_prompty(path):
         config_content, prompt_template = result.groups()
         configs = load_yaml_string(config_content)
         return configs, prompt_template

+    @property
+    def model(self) -> PromptyModelConfiguration:
+        """
+        Returns the parsed and resolved model configuration associated with this object.
+        """
+        return self._model
+
     def _resolve_inputs(self, input_values):
         """
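The commit message suggests a flex flow can branch on the exposed `model` property. A minimal sketch of that pattern, using stand-in dataclasses in place of the real `Prompty` and `PromptyModelConfiguration` classes (which require a prompty file to load); the `api` and `parameters` attribute names are assumptions based on the "chat or completion" and "streaming" examples in the commit message:

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for PromptyModelConfiguration.
@dataclass
class FakeModelConfig:
    api: str = "chat"                          # e.g. "chat" or "completion"
    parameters: dict = field(default_factory=dict)

# Hypothetical stand-in for Prompty, mirroring the read-only
# property this commit adds.
@dataclass
class FakePrompty:
    _model: FakeModelConfig = field(default_factory=FakeModelConfig)

    @property
    def model(self) -> FakeModelConfig:
        """Parsed and resolved model configuration (as in the commit)."""
        return self._model

def flex_entry(prompty: FakePrompty, question: str) -> str:
    """Adapt external flex-flow logic to the prompty's model config."""
    mode = "chat" if prompty.model.api == "chat" else "completion"
    if prompty.model.parameters.get("stream"):
        mode += "+streaming"
    return mode

p = FakePrompty(FakeModelConfig(api="chat", parameters={"stream": True}))
print(flex_entry(p, "hello"))  # → chat+streaming
```

With the real classes, a flex flow would inspect `prompty.model` the same way after loading the prompty, instead of re-parsing the `.prompty` file's front matter itself.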
