Add new attribute 'citations' to ConversationMessage type #193
base: main
Conversation
Hi @hghandri
Great idea! I will work on it. By the way, I've made some changes to integrate the new version of ConversationMessage.
This is a good start: usage and latency
@brnaba-aws: I've made some changes to the Bedrock LLM agent. What do you think? Is this the expected approach?
… used to handle additional metadata related to a response, such as token consumption.
…bedrock LLM and OpenAI
Force-pushed from eb084b6 to 6bb8e1d
@brnaba-aws: I've just pushed the latest update to the source; could you check?
Hello @brnaba-aws, @cornelcroi. Any updates on this PR? We need it for our internal project.
@hghandri
    latencyMs: int

class ConversationMessageMetadata(TypedDict):
    citations: List[str]
We should make citations a bit more generic, like Document in LangChain. In the end, a citation is a file, a location, and text.
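A rough sketch of what that more generic shape could look like, loosely modeled on LangChain's Document; the field names below are illustrative assumptions, not part of this PR:

from typing import List, TypedDict

class Citation(TypedDict):
    # Illustrative fields: in the end a citation is a file, a location, and text.
    file: str       # source file or URL
    location: str   # where in the source, e.g. a section or character range
    text: str       # the excerpt the model relied on

class ConversationMessageMetadata(TypedDict):
    citations: List[Citation]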
@dataclass
class ConversationMessageMetadata:
    citations: List[str] = field(default_factory=list)
This is not sufficient. We also need to account for the document name, location, and page within the document.
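One possible dataclass shape capturing this feedback, as a sketch only (the field names are assumptions, not the agreed API):

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Citation:
    document_name: str              # e.g. "user-guide.pdf"
    location: Optional[str] = None  # e.g. an S3 URI or a URL
    page: Optional[int] = None      # page within the document
    text: Optional[str] = None      # the cited excerpt

@dataclass
class ConversationMessageMetadata:
    citations: List[Citation] = field(default_factory=list)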
@@ -23,10 +23,13 @@ class AgentResponse:

class AgentCallbacks:
-    def on_llm_new_token(self, token: str) -> None:
+    def on_llm_new_token(self, message: ConversationMessage) -> None:
Do not change this; it has already been taken care of in another PR.
        # Default implementation
        pass

+    def on_llm_end(self, token: ConversationMessage) -> None:
Remove that; it has already been taken care of in another PR.
@@ -138,7 +138,12 @@ async def process_request(
                decoded_response = chunk['bytes'].decode('utf-8')

                # Trigger callback for each token (useful for real-time updates)
-                self.callbacks.on_llm_new_token(decoded_response)
+                self.callbacks.on_llm_new_token(
revert this
@@ -205,7 +205,12 @@ async def handle_streaming_response(self, input) -> Any:
        async with self.client.messages.stream(**input) as stream:
            async for event in stream:
                if event.type == "text":
-                    self.callbacks.on_llm_new_token(event.text)
+                    self.callbacks.on_llm_new_token(
revert this
            elif 'text' in delta:
                text += delta['text']
-                self.callbacks.on_llm_new_token(delta['text'])
+                self.callbacks.on_llm_new_token(
revert this
                    metrics=chunk['metadata']['metrics']
                )

+            print('generate message stream :', message)
remove print
            )

        except Exception as error:
+            print(traceback.print_exc())
remove print
Issue number:
Summary
Update the "ConversationMessage" type to include a "citations" property, making it possible to reference the sources the LLM used to generate its responses.
This is the first step towards letting everyone conveniently use citations for each agent, with built-in source and citation capabilities.
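As a rough illustration of the proposed shape (a sketch assuming a dataclass-style ConversationMessage; only the citations field is the subject of this PR, and the other fields loosely follow the existing type):

from dataclasses import dataclass
from typing import Any, List, Optional

@dataclass
class ConversationMessage:
    role: str                               # e.g. "user" or "assistant"
    content: List[Any]                      # existing content blocks
    citations: Optional[List[str]] = None   # proposed: sources used by the LLM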
Changes
User experience
For certain LLMs that use the web or knowledge-base systems, it can be beneficial to track the source of a response, such as a link to a website or a document in the knowledge base.
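A hypothetical consumption example, assuming the orchestrator's route_request signature and assuming the response metadata carries the proposed citations list (both are assumptions for illustration, not the final API):

async def show_sources(orchestrator, user_input: str, user_id: str, session_id: str) -> None:
    # Assumption: the response metadata exposes the proposed `citations` attribute.
    response = await orchestrator.route_request(user_input, user_id, session_id)
    for citation in getattr(response.metadata, "citations", None) or []:
        # A citation could be a web link or a knowledge-base document reference.
        print(f"Source: {citation}")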
Checklist
If your change doesn't seem to apply, please leave them unchecked.
Is this a breaking change?
RFC issue number:
Checklist:
Acknowledgment
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.
Disclaimer: We value your time and bandwidth. As such, any pull requests created on non-triaged issues might not be successful.