This repository has been archived by the owner on Dec 29, 2022. It is now read-only.

The in-graph beam search #184

Open

rhuangq opened this issue Apr 21, 2017 · 3 comments


rhuangq commented Apr 21, 2017

Hi @dennybritz, the in-graph beam search is pretty nice. I have a couple of questions; could you please clarify?

  • If we need to save the inference graph for C++ deployment, are configurations like beam width and length-norm weight defined through a placeholder tensor, or do they have to be baked into the graph?
  • What does the inference speed look like on the NMT task (as described in your paper https://arxiv.org/abs/1703.03906: beam width 10, K80 GPU)?
  • It may be a little inflexible if we want to use additional information, such as a language model score, to guide the search (see the sketch at the end of this comment for what I have in mind). Do you have any comments on that aspect?

Many Thanks!
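
To make the last point concrete, here is a rough, framework-agnostic sketch of the kind of fusion I have in mind: log-linear interpolation of the decoder and language-model log-probabilities, ranked with a GNMT-style length penalty. The callables `seq2seq_logprob` and `lm_logprob`, the weights, and the hypothesis layout are all hypothetical stand-ins, not this repository's API.

```python
import heapq

def beam_step(hypotheses, vocab, seq2seq_logprob, lm_logprob,
              beam_width=10, lm_weight=0.3, alpha=0.6):
    """One expansion step of beam search with log-linear LM fusion.

    hypotheses: list of (tokens, cumulative_logprob) pairs.
    seq2seq_logprob / lm_logprob: hypothetical callables mapping
    (prefix, token) -> log-probability; stand-ins for the decoder
    and an external language model.
    """
    candidates = []
    for tokens, cum_logprob in hypotheses:
        for token in vocab:
            # Log-linear fusion of decoder and LM scores.
            fused = (seq2seq_logprob(tokens, token)
                     + lm_weight * lm_logprob(tokens, token))
            new_cum = cum_logprob + fused
            # GNMT-style length penalty; alpha plays the role of the
            # "length norm weight" mentioned above.
            lp = ((5.0 + len(tokens) + 1) / 6.0) ** alpha
            candidates.append((new_cum / lp, tokens + [token], new_cum))
    # Keep the beam_width best candidates by length-normalized score.
    best = heapq.nlargest(beam_width, candidates, key=lambda c: c[0])
    return [(tokens, cum) for _, tokens, cum in best]
```

The question is whether something like this scoring rule can be plugged into the in-graph search, or whether the scoring is fixed inside the graph.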

@Syndrome777

Hope to get comments about this issue too.

@howlinghuffy

Would like some more info on this too, particularly on guiding the search.

@sathyarr

It seems the beam search functionality cannot be added to the inference graph if you need to use it from C++ or TF Serving. Kindly refer to this issue.
