
community: fix perplexity response parameters not being included in model response #30440

Conversation

DavidSanSan110
Copy link
Contributor

This pull request improves how additional keyword arguments (additional_kwargs) are handled in the message-processing methods of perplexity.py in the chat_models module. It also adds unit tests to verify that citations, images, and related questions are correctly included in additional_kwargs.

Issue: resolves #30439

Enhancements to perplexity.py:

  • libs/community/langchain_community/chat_models/perplexity.py: Updated the _stream and _generate methods to build additional_kwargs from the API response and merge citations, images, and related questions into the message output.

New unit tests:

  • libs/community/tests/unit_tests/chat_models/test_perplexity.py: Added new tests test_perplexity_stream_includes_citations_and_images and test_perplexity_stream_includes_citations_and_related_questions to verify that the stream method correctly includes citations, images, and related questions in the additional_kwargs.
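The extraction logic described above can be sketched as a small helper. This is a hypothetical illustration, not the PR's actual code: the field names (citations, images, related_questions) come from the PR description, but the exact shape of the Perplexity API response is an assumption.

```python
def build_additional_kwargs(response: dict) -> dict:
    """Collect Perplexity-specific extras from a raw API response.

    Field names follow the PR description; the response shape is assumed
    to be a flat dict as returned by the OpenAI-compatible client.
    """
    extras = {}
    for key in ("citations", "images", "related_questions"):
        # Only keep fields the API actually populated.
        if response.get(key) is not None:
            extras[key] = response[key]
    return extras
```

A caller would then merge the result into the message, e.g. `message.additional_kwargs.update(build_additional_kwargs(raw_response))`.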


@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. community Related to langchain-community 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature labels Mar 23, 2025

  chunk = self._convert_delta_to_message_chunk(
-     choice["delta"], default_chunk_class
+     choice["delta"], default_chunk_class, additional_kwargs
  )
Collaborator

nitpick but could we update chunk.additional_kwargs below in this function, as we were doing before, instead of passing it into _convert_delta_to_message_chunk and mutating it there?

Contributor Author

I've updated the code to add additional_kwargs after calling _convert_delta_to_message_chunk, rather than modifying it inside the function. 👌
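The pattern settled on in this exchange, keeping the conversion helper pure and merging extras afterwards, can be sketched as a self-contained example. `AIMessageChunk` here is a stand-in for the real langchain-core class, and the response shape is an assumption:

```python
from dataclasses import dataclass, field


@dataclass
class AIMessageChunk:
    """Stand-in for langchain_core.messages.AIMessageChunk."""
    content: str
    additional_kwargs: dict = field(default_factory=dict)


def convert_delta_to_message_chunk(delta: dict) -> AIMessageChunk:
    """Pure conversion: does not mutate any caller-owned state."""
    return AIMessageChunk(content=delta.get("content", ""))


def process_stream_chunk(raw: dict) -> AIMessageChunk:
    """Convert a raw streamed chunk, then merge Perplexity extras onto it."""
    extras = {
        key: raw[key]
        for key in ("citations", "images", "related_questions")
        if raw.get(key)
    }
    chunk = convert_delta_to_message_chunk(raw["choices"][0]["delta"])
    # Merge after conversion, rather than passing extras into the helper.
    chunk.additional_kwargs.update(extras)
    return chunk
```

Keeping the helper free of side effects makes it easier to reuse across `_stream` and `_generate` and simpler to test in isolation.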

@DavidSanSan110 DavidSanSan110 requested a review from ccurme March 25, 2025 20:29
@ccurme ccurme self-assigned this Mar 26, 2025
@ccurme ccurme requested a review from Copilot March 26, 2025 19:26

Copilot

Pull Request Overview

This PR enhances the handling of additional keyword arguments in the Perplexity chat model responses and adds new tests to verify that citations, images, and related questions are correctly included.

  • Updated the _stream and _generate methods in perplexity.py to merge additional keyword arguments from API responses.
  • Added new unit tests to verify that the stream method correctly includes citations with images and related questions.
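The shape of what those tests verify can be sketched with a toy stream standing in for the mocked Perplexity client. This is a hedged illustration: the real tests patch the OpenAI-compatible client with pytest-mock, and the chunk field names mirror the PR description.

```python
def fake_stream():
    """Yield a chunk shaped like an assumed Perplexity streaming payload."""
    yield {
        "choices": [{"delta": {"content": "Hello"}}],
        "citations": ["https://example.com"],
        "images": [{"image_url": "https://example.com/x.png"}],
    }


def test_stream_includes_citations_and_images():
    for raw in fake_stream():
        extras = {
            key: raw[key]
            for key in ("citations", "images", "related_questions")
            if raw.get(key)
        }
        # Citations and images present in the payload must survive into extras.
        assert extras["citations"] == ["https://example.com"]
        assert "images" in extras
        # Fields absent from the payload must not appear.
        assert "related_questions" not in extras
```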

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated no comments.

  • libs/community/langchain_community/chat_models/perplexity.py: Updated _stream and _generate to build and merge additional_kwargs from API responses.
  • libs/community/tests/unit_tests/chat_models/test_perplexity.py: Added tests to verify that citations, images, and related_questions are included in the streamed output.
Comments suppressed due to low confidence (2)

libs/community/tests/unit_tests/chat_models/test_perplexity.py:122

  • [nitpick] The test function's docstring indicates it only checks citations, yet the test also verifies that images are included. Consider updating the docstring to accurately reflect that both citations and images are being tested.
def test_perplexity_stream_includes_citations_and_images(mocker: MockerFixture) -> None:

libs/community/tests/unit_tests/chat_models/test_perplexity.py:211

  • [nitpick] The test function's docstring states it only checks citations while the test verifies both citations and related questions. Update the docstring for clarity.
def test_perplexity_stream_includes_citations_and_related_questions(mocker: MockerFixture) -> None:
@dosubot dosubot bot added the lgtm PR looks good. Use to confirm that a PR is ready for merging. label Mar 26, 2025
@ccurme ccurme merged commit 75823d5 into langchain-ai:master Mar 27, 2025
19 checks passed