tfc.entropy_models.ContinuousBatchedEntropyModel #186
Unanswered · 0 replies · MehdiSattari asked this question in Q&A
Hello,
I am new to GitHub and TensorFlow Compression, and I'm trying to implement the method from a research paper (https://arxiv.org/abs/2101.08687) using TensorFlow Compression. I've successfully used tfc.entropy_models.ContinuousBatchedEntropyModel for entropy coding of the latents.
Now I'd like to extend this to encode the weights, specifically the weight updates during training, using a spike-and-slab prior as described in the paper. This prior is essentially a two-component Gaussian mixture: a narrow "spike" around zero and a wide "slab". Unlike tfc.entropy_models.ContinuousBatchedEntropyModel, which quantizes to unit bins, the weight quantization uses a non-unit step size. Additionally, the paper uses a straight-through estimator so that gradients can pass through the quantization step during training.
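To make the setup concrete, here is a minimal NumPy sketch of what I have in mind. It is not based on the tfc API; the function names, the step size `delta`, and the mixture parameters (`pi`, `sigma_spike`, `sigma_slab`) are my own illustrative choices. It discretizes a zero-mean spike-and-slab (two-Gaussian) prior into bins of width `delta` and computes the ideal code length of quantized weights under that prior:

```python
import math
import numpy as np

def normal_cdf(x, sigma):
    # CDF of a zero-mean Gaussian with standard deviation sigma (scalar x).
    return 0.5 * (1.0 + math.erf(x / (sigma * math.sqrt(2.0))))

def spike_and_slab_pmf(k, delta, pi, sigma_spike, sigma_slab):
    # Probability that a weight update falls in the k-th quantization bin
    # of width delta, under a mixture of a narrow "spike" Gaussian
    # (mixture weight pi) and a wide "slab" Gaussian (weight 1 - pi).
    lo, hi = (k - 0.5) * delta, (k + 0.5) * delta
    return (pi * (normal_cdf(hi, sigma_spike) - normal_cdf(lo, sigma_spike))
            + (1.0 - pi) * (normal_cdf(hi, sigma_slab) - normal_cdf(lo, sigma_slab)))

def quantize(w, delta):
    # Forward pass of non-unit-bin quantization. In TensorFlow the
    # straight-through estimator version would be
    #   w + tf.stop_gradient(tf.round(w / delta) * delta - w)
    # so rounding is skipped in the backward pass.
    return np.round(np.asarray(w) / delta) * delta

def rate_in_bits(w, delta, pi, sigma_spike, sigma_slab):
    # Ideal code length (in bits) of the quantized weights under the prior;
    # a range coder driven by the same PMF would approach this rate.
    ks = np.round(np.asarray(w) / delta).astype(int)
    return -sum(math.log2(spike_and_slab_pmf(int(k), delta, pi,
                                             sigma_spike, sigma_slab))
                for k in ks)
```

My question is essentially whether something equivalent to this can be expressed with (or retrofitted into) the tfc entropy model classes, which as far as I can tell assume unit bins.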
I'm looking for guidance on how to modify tfc.entropy_models.ContinuousBatchedEntropyModel to support entropy coding with these features (a Gaussian-mixture prior, non-unit quantization bins, and a straight-through estimator). Alternatively, I'd appreciate recommendations for any Python modules that can compute the coding rate and produce encoded bitstreams under a spike-and-slab prior.