
De-dup concurrent requests in ramping, current and delete updates #7339

Open · wants to merge 5 commits into base: main

Conversation


@Shivs11 Shivs11 commented Feb 13, 2025

What changed?

  • Set the UpdateID for SetCurrent and SetRamping requests to the hash of the request input and the conflict token (when one is provided).
  • Added an additional layer of validation for SetCurrent, SetRamping, and DeleteVersion update requests inside each update handler.
  • Added tests covering the above behaviour.

Why?

  • Right now, concurrent updates can bypass the validator and get executed. This can be prevented either by using a deterministic UpdateID for requests that carry a conflict token, or by adding an extra layer of protection inside the update handler (see the sketch below).
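
A minimal sketch of that handler-side protection, using illustrative stand-in types (deploymentState, setCurrentRequest) rather than the real workflow state and request protos:

package example

// deploymentState and setCurrentRequest are illustrative stand-ins for the
// real worker-deployment workflow state and request types.
type deploymentState struct {
	CurrentVersion string
}

type setCurrentRequest struct {
	Version string
}

// handleSetCurrent re-checks the state inside the handler: even if a
// concurrent request was admitted past the update validator, applying it
// here becomes a no-op that simply returns the already-current version.
func handleSetCurrent(state *deploymentState, req setCurrentRequest) string {
	if state.CurrentVersion == req.Version {
		// Duplicate of work already done: return the existing state, no error.
		return state.CurrentVersion
	}
	state.CurrentVersion = req.Version
	return state.CurrentVersion
}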

How did you test it?

  • Added functional tests covering this behaviour for the three operations in question.

Potential risks

  • None; if anything, this change makes the feature more robust.

Documentation

Is hotfix candidate?

  • No

@Shivs11 Shivs11 requested a review from a team as a code owner February 13, 2025 20:40
@Shivs11 Shivs11 requested a review from carlydf February 13, 2025 21:20
// When a conflict token is provided, we use the hash of the conflict token, deploymentName, and version. This is done
// to ensure automatic de-duplication of updates with the same UpdateID.

combinedBytes := make([]byte, 0, len(conflictToken)+len(deploymentName)+len(version))
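
A sketch of how the rest of this construction could look; the variable names come from the diff above, while the choice of sha256 and the hex-encoded UpdateID are assumptions rather than necessarily what the PR does:

// Assumed imports: "crypto/sha256", "encoding/hex".
combinedBytes = append(combinedBytes, conflictToken...)
combinedBytes = append(combinedBytes, deploymentName...)
combinedBytes = append(combinedBytes, version...)

sum := sha256.Sum256(combinedBytes)
updateID := hex.EncodeToString(sum[:]) // identical inputs always yield the same UpdateID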
Collaborator

I'm thinking it's better to just serialize the whole input and take its hash. It'll be safer later when more inputs are added, and it also covers the whole request, including the client identity etc. (If the client identity is different, I think we should not de-dup it.)
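
For illustration, one way that suggestion could look, assuming the request is a protobuf message (the function name and package are hypothetical; the Deterministic option only stabilizes map ordering and is not a canonical encoding, so this is a sketch rather than a guaranteed-stable scheme):

package example

import (
	"crypto/sha256"
	"encoding/hex"

	"google.golang.org/protobuf/proto"
)

// updateIDFromRequest hashes a serialization of the whole request, so any
// field added later automatically participates in de-duplication.
func updateIDFromRequest(req proto.Message) (string, error) {
	blob, err := proto.MarshalOptions{Deterministic: true}.Marshal(req)
	if err != nil {
		return "", err
	}
	sum := sha256.Sum256(blob)
	return hex.EncodeToString(sum[:]), nil
}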

Member Author

I'm not sure about including the client identity as part of the hash. If two identical requests come in from different call sites (and hence with different identities), we want them to be considered duplicates, right?

De-dup, in my opinion, should be based solely on the inputs sent by clients: if the system has already accepted the same set of inputs, why let it do the work again to reach the same state? For requests that do get de-duped, we also return the proper response objects rather than errors, so calls from a different call site don't see any odd failures.
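
To illustrate that last point: conceptually, a second request that maps to the same derived UpdateID just receives the outcome of the first one. The cache below is purely illustrative; the real de-duplication is done by the workflow update machinery keyed on UpdateID:

package example

import "sync"

// updateOutcomes is an illustrative stand-in for the server attaching a
// duplicate update (same UpdateID) to the already-accepted one.
type updateOutcomes struct {
	mu   sync.Mutex
	done map[string]string // UpdateID -> response
}

// getOrRun returns the recorded response for a duplicate UpdateID; otherwise
// it runs the update and records its response. Duplicates never see an error.
func (u *updateOutcomes) getOrRun(updateID string, run func() string) string {
	u.mu.Lock()
	defer u.mu.Unlock()
	if resp, ok := u.done[updateID]; ok {
		return resp
	}
	resp := run()
	if u.done == nil {
		u.done = make(map[string]string)
	}
	u.done[updateID] = resp
	return resp
}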
