diff --git a/readme/nlp/anaphora_parser.md b/readme/nlp/anaphora_parser.md
index d656762..c3275e6 100644
--- a/readme/nlp/anaphora_parser.md
+++ b/readme/nlp/anaphora_parser.md
@@ -2,9 +2,9 @@
 
 ## Coreference and Anaphora Resolution
 - [2012 EMNLP] **Joint Entity and Event Coreference Resolution across Documents**, [[paper]](https://aclweb.org/anthology/D/D12/D12-1045.pdf).
-- [2016 EMNLP] **Deep Reinforcement Learning for Mention-Ranking Coreference Models**, [[paper]](https://arxiv.org/abs/1609.08667), [[blog]](https://medium.com/huggingface/state-of-the-art-neural-coreference-resolution-for-chatbots-3302365dcf30), [[demo]](https://huggingface.co/coref/), sources: [[huggingface/neuralcoref]](https://github.com/huggingface/neuralcoref), [[clarkkev/deep-coref]](https://github.com/clarkkev/deep-coref).
+- [2016 EMNLP] **Deep Reinforcement Learning for Mention-Ranking Coreference Models**, [[paper]](https://arxiv.org/pdf/1609.08667.pdf), [[blog]](https://medium.com/huggingface/state-of-the-art-neural-coreference-resolution-for-chatbots-3302365dcf30), [[demo]](https://huggingface.co/coref/), sources: [[huggingface/neuralcoref]](https://github.com/huggingface/neuralcoref), [[clarkkev/deep-coref]](https://github.com/clarkkev/deep-coref).
 - [2016 ACL] **Improving Coreference Resolution by Learning Entity-Level Distributed Representations**, [[paper]](https://cs.stanford.edu/people/kevclark/resources/clark-manning-acl16-improving.pdf), sources: [[clarkkev/deep-coref]](https://github.com/clarkkev/deep-coref).
-- [2017 ArXiv] **Linguistic Knowledge as Memory for Recurrent Neural Networks**, [[paper]](https://arxiv.org/abs/1703.02620).
+- [2017 ArXiv] **Linguistic Knowledge as Memory for Recurrent Neural Networks**, [[paper]](https://arxiv.org/pdf/1703.02620.pdf).
 
 ## Dependency Parser
 - [2014 EMNLP] **A Fast and Accurate Dependency Parser using Neural Networks**, [[paper]](https://cs.stanford.edu/~danqi/papers/emnlp2014.pdf), sources: [[akjindal53244/dependency_parsing_tf]](https://github.com/akjindal53244/dependency_parsing_tf), [[ljj314zz/dependency_parsing_tf-master]](https://github.com/ljj314zz/dependency_parsing_tf-master).
@@ -17,4 +17,4 @@
 
 ## Grammatical Error Correction
 - [2014 CoNLL] **The CoNLL-2014 Shared Task on Grammatical Error Correction**, [[paper]](http://www.aclweb.org/anthology/W14-1701), [[homepage]](http://www.comp.nus.edu.sg/~nlp/conll14st.html).
-- [2018 AAAI] **A Multilayer Convolutional Encoder-Decoder Neural Network for Grammatical Error Correction**, [[paper]](https://arxiv.org/abs/1801.08831), [[nusnlp/mlconvgec2018]](https://github.com/nusnlp/mlconvgec2018).
+- [2018 AAAI] **A Multilayer Convolutional Encoder-Decoder Neural Network for Grammatical Error Correction**, [[paper]](https://arxiv.org/pdf/1801.08831.pdf), [[nusnlp/mlconvgec2018]](https://github.com/nusnlp/mlconvgec2018).
diff --git a/readme/nlp/commonsense.md b/readme/nlp/commonsense.md
index 6259710..57ce49a 100644
--- a/readme/nlp/commonsense.md
+++ b/readme/nlp/commonsense.md
@@ -4,15 +4,15 @@
 - [2013 NIPS] **Reasoning With Neural Tensor Networks for Knowledge Base Completion**, [[paper]](https://nlp.stanford.edu/pubs/SocherChenManningNg_NIPS2013.pdf), sources: [[siddharth-agrawal/Neural-Tensor-Network]](https://github.com/siddharth-agrawal/Neural-Tensor-Network), [[dddoss/tensorflow-socher-ntn]](https://github.com/dddoss/tensorflow-socher-ntn).
 - [2013 NIPS] **TransE: Translating Embeddings for Modeling Multi-relational Data**, [[paper]](https://papers.nips.cc/paper/5071-translating-embeddings-for-modeling-multi-relational-data.pdf), sources: [[thunlp/TensorFlow-TransX]](https://github.com/thunlp/TensorFlow-TransX).
 - [2014 AAAI] **TransH: Knowledge Graph Embedding by Translating on Hyperplanes**, [[paper]](https://www.aaai.org/ocs/index.php/AAAI/AAAI14/paper/view/8531/8546), sources: [[thunlp/TensorFlow-TransX]](https://github.com/thunlp/TensorFlow-TransX).
-- [2015 EMNLP] **PTransE: Modeling Relation Paths for Representation Learning of Knowledge Bases**, [[paper]](https://arxiv.org/abs/1506.00379), [[homepage]](https://github.com/thunlp), sources: [[thunlp/Fast-TransX]](https://github.com/thunlp/Fast-TransX).
+- [2015 EMNLP] **PTransE: Modeling Relation Paths for Representation Learning of Knowledge Bases**, [[paper]](https://arxiv.org/pdf/1506.00379.pdf), [[homepage]](https://github.com/thunlp), sources: [[thunlp/Fast-TransX]](https://github.com/thunlp/Fast-TransX).
 - [2015 AAAI] **TransR: Learning Entity and Relation Embeddings for Knowledge Graph Completion**, [[paper]](https://www.aaai.org/ocs/index.php/AAAI/AAAI15/paper/view/9571/9523), sources: [[thunlp/TensorFlow-TransX]](https://github.com/thunlp/TensorFlow-TransX).
 - [2015 ACL] **TransD: Knowledge Graph Embedding via Dynamic Mapping Matrix**, [[paper]](http://www.aclweb.org/anthology/P15-1067), sources: [[thunlp/TensorFlow-TransX]](https://github.com/thunlp/TensorFlow-TransX).
 - [2016 AAAI] **Knowledge Graph Completion with Adaptive Sparse Transfer Matrix**, [[paper]](https://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/view/11982/11693), sources: [[FrankWork/transparse]](https://github.com/FrankWork/transparse), [[thunlp/Fast-TransX]](https://github.com/thunlp/Fast-TransX).
 - [2016 ACL] **Commonsense Knowledge Base Completion**, [[paper]](http://ttic.uchicago.edu/~kgimpel/papers/li+etal.acl16.pdf), [[homepage]](http://ttic.uchicago.edu/~kgimpel/commonsense.html), sources: [[Lorraine333/ACL_CKBC]](https://github.com/Lorraine333/ACL_CKBC).
-- [2017 AKBC] **RelNet: End-to-End Modeling of Entities & Relations**, [[paper]](https://arxiv.org/abs/1706.07179), [[homepage]](http://thetb.github.io).
+- [2017 AKBC] **RelNet: End-to-End Modeling of Entities & Relations**, [[paper]](https://arxiv.org/pdf/1706.07179.pdf), [[homepage]](http://thetb.github.io).
 - [2017 EMNLP] **Context-Aware Representations for Knowledge Base Relation Extraction**, [[paper]](http://aclweb.org/anthology/D17-1188), sources: [[UKPLab/emnlp2017-relation-extraction]](https://github.com/UKPLab/emnlp2017-relation-extraction).
 - [2018 AAAI] **SenticNet 5: Discovering Conceptual Primitives for Sentiment Analysis by Means of Context Embeddings**, [[paper]](http://sentic.net/senticnet-5.pdf).
 
 ## Commonsense Knowledge Base and Its Usages
-- [2017 AAAI] **ConceptNet 5.5: An Open Multilingual Graph of General Knowledge**, [[paper]](https://arxiv.org/abs/1612.03975), sources: [[GitHub page]](https://github.com/commonsense), [[commonsense/conceptnet5]](https://github.com/commonsense/conceptnet5), [[commonsense/conceptnet-numberbatch]](https://github.com/commonsense/conceptnet-numberbatch).
+- [2017 AAAI] **ConceptNet 5.5: An Open Multilingual Graph of General Knowledge**, [[paper]](https://arxiv.org/pdf/1612.03975.pdf), sources: [[GitHub page]](https://github.com/commonsense), [[commonsense/conceptnet5]](https://github.com/commonsense/conceptnet5), [[commonsense/conceptnet-numberbatch]](https://github.com/commonsense/conceptnet-numberbatch).
 - [2017 CIKM] **Commonsense for Machine Intelligence: Text to Knowledge and Knowledge to Text**, [[slides]](http://people.mpi-inf.mpg.de/~ntandon/presentations/cikm-2017-tutorial-commonsense/commonsense.pdf), [[CIKM 2017 Singapore Tutorials]](http://cikm2017.org/tutorialmain.html), [[Commonsense for Machine Intelligence, Allen Institute, CIKM 2017 TUTORIAL]](http://allenai.org/tutorials/csk/), [[Allen Institute]](http://allenai.org/index.html).
\ No newline at end of file
diff --git a/readme/nlp/dialogue_system.md b/readme/nlp/dialogue_system.md
index 5be49ea..c35de8e 100644
--- a/readme/nlp/dialogue_system.md
+++ b/readme/nlp/dialogue_system.md
@@ -1,15 +1,15 @@
 # Dialogue / Conversation / Chatbot System
 
 - [2013 IEEE] **POMDP-based Statistical Spoken Dialogue Systems: a Review**, [[paper]](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/young2013procieee.pdf).
-- [2014 NIPS] **Sequence to Sequence Learning with Neural Networks**, [[paper]](https://arxiv.org/abs/1409.3215), sources: [[farizrahman4u/seq2seq]](https://github.com/farizrahman4u/seq2seq), [[ma2rten/seq2seq]](https://github.com/ma2rten/seq2seq), [[JayParks/tf-seq2seq]](https://github.com/JayParks/tf-seq2seq), [[macournoyer/neuralconvo]](https://github.com/macournoyer/neuralconvo).
-- [2015 CIKM] **A Hierarchical Recurrent Encoder-Decoder for Generative Context-Aware Query Suggestion**, [[paper]](https://arxiv.org/abs/1507.02221.pdf), sources: [[sordonia/hred-qs]](https://github.com/sordonia/hred-qs).
-- [2015 EMNLP] **Semantically Conditioned LSTM-based Natural Language Generation for Spoken Dialogue Systems**, [[paper]](https://arxiv.org/abs/1508.01745), sources: [[shawnwun/RNNLG]](https://github.com/shawnwun/RNNLG), [[hit-computer/SC-LSTM]](https://github.com/hit-computer/SC-LSTM).
-- [2015 ArXiv] **Attention with Intention for a Neural Network Conversation Model**, [[paper]](https://arxiv.org/abs/1510.08565).
-- [2015 ACL] **Neural Responding Machine for Short-Text Conversation**, [[paper]](https://arxiv.org/abs/1503.02364).
-- [2016 AAAI] **Building End-To-End Dialogue Systems Using Generative Hierarchical Neural Network Models**, [[paper]](https://arxiv.org/abs/1507.04808), sources: [[suriyadeepan/augmented_seq2seq]](https://github.com/suriyadeepan/augmented_seq2seq), [[julianser/hed-dlg]](https://github.com/julianser/hed-dlg), [[sordonia/hed-dlg]](https://github.com/sordonia/hed-dlg), [[julianser/hred-latent-piecewise]](https://github.com/julianser/hred-latent-piecewise), [[julianser/hed-dlg-truncated]](https://github.com/julianser/hed-dlg-truncated).
-- [2016 ACL] **On-line Active Reward Learning for Policy Optimisation in Spoken Dialogue Systems**, [[paper]](https://arxiv.org/abs/1605.07669).
-- [2016 EMNLP] **Deep Reinforcement Learning for Dialogue Generation**, [[paper]](https://arxiv.org/abs/1606.01541), sources: [[liuyuemaicha/Deep-Reinforcement-Learning-for-Dialogue-Generation-in-tensorflow]](https://github.com/liuyuemaicha/Deep-Reinforcement-Learning-for-Dialogue-Generation-in-tensorflow).
+- [2014 NIPS] **Sequence to Sequence Learning with Neural Networks**, [[paper]](https://arxiv.org/pdf/1409.3215.pdf), sources: [[farizrahman4u/seq2seq]](https://github.com/farizrahman4u/seq2seq), [[ma2rten/seq2seq]](https://github.com/ma2rten/seq2seq), [[JayParks/tf-seq2seq]](https://github.com/JayParks/tf-seq2seq), [[macournoyer/neuralconvo]](https://github.com/macournoyer/neuralconvo).
+- [2015 CIKM] **A Hierarchical Recurrent Encoder-Decoder for Generative Context-Aware Query Suggestion**, [[paper]](https://arxiv.org/pdf/1507.02221.pdf), sources: [[sordonia/hred-qs]](https://github.com/sordonia/hred-qs).
+- [2015 EMNLP] **Semantically Conditioned LSTM-based Natural Language Generation for Spoken Dialogue Systems**, [[paper]](https://arxiv.org/pdf/1508.01745.pdf), sources: [[shawnwun/RNNLG]](https://github.com/shawnwun/RNNLG), [[hit-computer/SC-LSTM]](https://github.com/hit-computer/SC-LSTM).
+- [2015 ArXiv] **Attention with Intention for a Neural Network Conversation Model**, [[paper]](https://arxiv.org/pdf/1510.08565.pdf).
+- [2015 ACL] **Neural Responding Machine for Short-Text Conversation**, [[paper]](https://arxiv.org/pdf/1503.02364.pdf).
+- [2016 AAAI] **Building End-To-End Dialogue Systems Using Generative Hierarchical Neural Network Models**, [[paper]](https://arxiv.org/pdf/1507.04808.pdf), sources: [[suriyadeepan/augmented_seq2seq]](https://github.com/suriyadeepan/augmented_seq2seq), [[julianser/hed-dlg]](https://github.com/julianser/hed-dlg), [[sordonia/hed-dlg]](https://github.com/sordonia/hed-dlg), [[julianser/hred-latent-piecewise]](https://github.com/julianser/hred-latent-piecewise), [[julianser/hed-dlg-truncated]](https://github.com/julianser/hed-dlg-truncated).
+- [2016 ACL] **On-line Active Reward Learning for Policy Optimisation in Spoken Dialogue Systems**, [[paper]](https://arxiv.org/pdf/1605.07669.pdf).
+- [2016 EMNLP] **Deep Reinforcement Learning for Dialogue Generation**, [[paper]](https://arxiv.org/pdf/1606.01541.pdf), sources: [[liuyuemaicha/Deep-Reinforcement-Learning-for-Dialogue-Generation-in-tensorflow]](https://github.com/liuyuemaicha/Deep-Reinforcement-Learning-for-Dialogue-Generation-in-tensorflow).
 - [2016 EMNLP] **Multi-view Response Selection for Human-Computer Conversation**, [[paper]](http://www.aclweb.org/anthology/D16-1036).
-- [2017 ACM] **A Survey on Dialogue Systems: Recent Advances and New Frontiers**, [[paper]](https://arxiv.org/abs/1711.01731.pdf), sources: [[shawnspace/survey-in-dialog-system]](https://github.com/shawnspace/survey-in-dialog-system).
-- [2017 EMNLP] **Adversarial Learning for Neural Dialogue Generation**, [[paper]](https://arxiv.org/abs/1701.06547), sources: [[jiweil/Neural-Dialogue-Generation]](https://github.com/jiweil/Neural-Dialogue-Generation), [[liuyuemaicha/Adversarial-Learning-for-Neural-Dialogue-Generation-in-Tensorflow]](https://github.com/liuyuemaicha/Adversarial-Learning-for-Neural-Dialogue-Generation-in-Tensorflow).
-- [2017 ACL] **Sequential Matching Network: A New Architecture for Multi-turn Response Selection in Retrieval-Based Chatbots**, [[paper]](https://arxiv.org/abs/1612.01627), sources: [[MarkWuNLP/MultiTurnResponseSelection]](https://github.com/MarkWuNLP/MultiTurnResponseSelection), [[krayush07/sequential-match-network]](https://github.com/krayush07/sequential-match-network).
+- [2017 ACM] **A Survey on Dialogue Systems: Recent Advances and New Frontiers**, [[paper]](https://arxiv.org/pdf/1711.01731.pdf), sources: [[shawnspace/survey-in-dialog-system]](https://github.com/shawnspace/survey-in-dialog-system).
+- [2017 EMNLP] **Adversarial Learning for Neural Dialogue Generation**, [[paper]](https://arxiv.org/pdf/1701.06547.pdf), sources: [[jiweil/Neural-Dialogue-Generation]](https://github.com/jiweil/Neural-Dialogue-Generation), [[liuyuemaicha/Adversarial-Learning-for-Neural-Dialogue-Generation-in-Tensorflow]](https://github.com/liuyuemaicha/Adversarial-Learning-for-Neural-Dialogue-Generation-in-Tensorflow).
+- [2017 ACL] **Sequential Matching Network: A New Architecture for Multi-turn Response Selection in Retrieval-Based Chatbots**, [[paper]](https://arxiv.org/pdf/1612.01627.pdf), sources: [[MarkWuNLP/MultiTurnResponseSelection]](https://github.com/MarkWuNLP/MultiTurnResponseSelection), [[krayush07/sequential-match-network]](https://github.com/krayush07/sequential-match-network).
diff --git a/readme/nlp/embeddings.md b/readme/nlp/embeddings.md
index 288427b..a2c1cdc 100644
--- a/readme/nlp/embeddings.md
+++ b/readme/nlp/embeddings.md
@@ -6,17 +6,17 @@
 ## Word Embedding
 - [2008 NIPS] **HLBL: A Scalable Hierarchical Distributed Language Model**, [[paper]](http://www.cs.toronto.edu/~fritz/absps/andriytree.pdf), [[wenjieguan/Log-bilinear-language-models]](https://github.com/wenjieguan/Log-bilinear-language-models).
 - [2010 INTERSPEECH] **RNNLM: Recurrent neural network based language model**, [[paper]](http://www.fit.vutbr.cz/research/groups/speech/publi/2010/mikolov_interspeech2010_IS100722.pdf), [[Ph.D. Thesis]](http://www.fit.vutbr.cz/~imikolov/rnnlm/thesis.pdf), [[slides]](http://www.fit.vutbr.cz/~imikolov/rnnlm/google.pdf), sources: [[mspandit/rnnlm]](https://github.com/mspandit/rnnlm).
-- [2013 NIPS] **Word2Vec: Distributed Representations of Words and Phrases and their Compositionality**, [[paper]](https://arxiv.org/abs/1310.4546), [[word2vec explained]](https://arxiv.org/abs/1402.3722), [[params explained]](https://arxiv.org/abs/1411.2738), [[blog]](https://isaacchanghau.github.io/post/word2vec/), sources: [[word2vec]](https://code.google.com/archive/p/word2vec/), [[dav/word2vec]](https://github.com/dav/word2vec), [[yandex/faster-rnnlm]](https://github.com/yandex/faster-rnnlm), [[tf-word2vec]](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/tutorials/word2vec), [[zake7749/word2vec-tutorial]](https://github.com/zake7749/word2vec-tutorial).
+- [2013 NIPS] **Word2Vec: Distributed Representations of Words and Phrases and their Compositionality**, [[paper]](https://arxiv.org/pdf/1310.4546.pdf), [[word2vec explained]](https://arxiv.org/pdf/1402.3722.pdf), [[params explained]](https://arxiv.org/pdf/1411.2738.pdf), [[blog]](https://isaacchanghau.github.io/post/word2vec/), sources: [[word2vec]](https://code.google.com/archive/p/word2vec/), [[dav/word2vec]](https://github.com/dav/word2vec), [[yandex/faster-rnnlm]](https://github.com/yandex/faster-rnnlm), [[tf-word2vec]](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/tutorials/word2vec), [[zake7749/word2vec-tutorial]](https://github.com/zake7749/word2vec-tutorial).
 - [2013 CoNLL] **Better Word Representations with Recursive Neural Networks for Morphology**, [[paper]](https://nlp.stanford.edu/~lmthang/data/papers/conll13_morpho.pdf).
 - [2014 ACL] **Word2Vecf: Dependency-Based Word Embeddings**, [[paper]](http://www.aclweb.org/anthology/P14-2050), [[blog]](https://isaacchanghau.github.io/post/word2vecf/), sources: [[Yoav Goldberg/word2vecf]](https://bitbucket.org/yoavgo/word2vecf), [[IsaacChanghau/Word2VecfJava]](https://github.com/IsaacChanghau/Word2VecfJava).
 - [2014 EMNLP] **GloVe: Global Vectors for Word Representation**, [[paper]](https://nlp.stanford.edu/pubs/glove.pdf), [[homepage]](https://nlp.stanford.edu/projects/glove/), sources: [[stanfordnlp/GloVe]](https://github.com/stanfordnlp/GloVe).
 - [2014 ICML] **Compositional Morphology for Word Representations and Language Modelling**, [[paper]](http://proceedings.mlr.press/v32/botha14.pdf), sources: [[thompsonb/comp-morph]](https://github.com/thompsonb/comp-morph), [[claravania/subword-lstm-lm]](https://github.com/claravania/subword-lstm-lm).
 - [2015 ACL] **Hyperword: Improving Distributional Similarity with Lessons Learned from Word Embeddings**, [[paper]](http://www.aclweb.org/anthology/Q15-1016), sources: [[Omer Levy/hyperwords]](https://bitbucket.org/omerlevy/hyperwords).
-- [2016 ICLR] **Exploring the Limits of Language Modeling**, [[paper]](https://arxiv.org/abs/1602.02410.pdf), [[slides]](https://www.cs.toronto.edu/~duvenaud/courses/csc2541/slides/lipnet.pdf), sources: [[tensorflow/models/lm_1b]](https://github.com/tensorflow/models/tree/master/research/lm_1b).
+- [2016 ICLR] **Exploring the Limits of Language Modeling**, [[paper]](https://arxiv.org/pdf/1602.02410.pdf), [[slides]](https://www.cs.toronto.edu/~duvenaud/courses/csc2541/slides/lipnet.pdf), sources: [[tensorflow/models/lm_1b]](https://github.com/tensorflow/models/tree/master/research/lm_1b).
 - [2016 CoNLL] **Context2Vec: Learning Generic Context Embedding with Bidirectional LSTM**, [[paper]](http://www.aclweb.org/anthology/K16-1006), sources: [[orenmel/context2vec]](https://github.com/orenmel/context2vec).
-- [2016 IEEE Intelligent Systems] **How to Generate a Good Word Embedding?**, [[paper]](https://arxiv.org/abs/1507.05523), [[基于神经网络的词和文档语义向量表示方法研究]](https://arxiv.org/pdf/1611.05962.pdf), [[blog]](http://licstar.net/archives/620), sources: [[licstar/compare]](https://github.com/licstar/compare).
-- [2016 ArXiv] **Linear Algebraic Structure of Word Senses, with Applications to Polysemy**, [[paper]](https://arxiv.org/abs/1601.03764.pdf), [[slides]](https://pdfs.semanticscholar.org/d770/5adf01fc9791337ed17dd37236129ef3a0f4.pdf), sources: [[YingyuLiang/SemanticVector]](https://github.com/YingyuLiang/SemanticVector).
-- [2017 ACL] **FastText: Enriching Word Vectors with Subword Information**, [[paper]](https://arxiv.org/abs/1607.04606.pdf), sources: [[facebookresearch/fastText]](https://github.com/facebookresearch/fastText), [[salestock/fastText.py]](https://github.com/salestock/fastText.py).
+- [2016 IEEE Intelligent Systems] **How to Generate a Good Word Embedding?**, [[paper]](https://arxiv.org/pdf/1507.05523.pdf), [[基于神经网络的词和文档语义向量表示方法研究]](https://arxiv.org/pdf/1611.05962.pdf), [[blog]](http://licstar.net/archives/620), sources: [[licstar/compare]](https://github.com/licstar/compare).
+- [2016 ArXiv] **Linear Algebraic Structure of Word Senses, with Applications to Polysemy**, [[paper]](https://arxiv.org/pdf/1601.03764.pdf), [[slides]](https://pdfs.semanticscholar.org/d770/5adf01fc9791337ed17dd37236129ef3a0f4.pdf), sources: [[YingyuLiang/SemanticVector]](https://github.com/YingyuLiang/SemanticVector).
+- [2017 ACL] **FastText: Enriching Word Vectors with Subword Information**, [[paper]](https://arxiv.org/pdf/1607.04606.pdf), sources: [[facebookresearch/fastText]](https://github.com/facebookresearch/fastText), [[salestock/fastText.py]](https://github.com/salestock/fastText.py).
 - [2017 ICLR] **A Simple But Tough-to-beat Baseline for Sentence Embeddings**, [[paper]](https://openreview.net/pdf?id=SyK00v5xx), sources: [[PrincetonML/SIF]](https://github.com/PrincetonML/SIF).
 - [2017 NIPS] **Learned in Translation: Contextualized Word Vectors**, [[paper]](https://arxiv.org/pdf/1708.00107.pdf), sources: [[salesforce/cove]](https://github.com/salesforce/cove).
 - [2017 ArXiv] **Implicitly Incorporating Morphological Information into Word Embedding**, [[paper]](https://arxiv.org/pdf/1701.02481.pdf).
diff --git a/readme/nlp/machine_comprehension.md b/readme/nlp/machine_comprehension.md
index 6c2389d..9363473 100644
--- a/readme/nlp/machine_comprehension.md
+++ b/readme/nlp/machine_comprehension.md
@@ -3,30 +3,30 @@
 ## Dataset
 - [2013 EMNLP] **MCTest: A Challenge Dataset for the Open-Domain Machine Comprehension of Text**, [[paper]](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/11/MCTest_EMNLP2013.pdf), [[homepage]](https://mattr1.github.io/mctest/), source: [[mcobzarenco/mctest]](https://github.com/mcobzarenco/mctest).
 - [2015 NIPS] **CNN/DailyMail: Teaching Machines to Read and Comprehend**, [[paper]](https://papers.nips.cc/paper/5945-teaching-machines-to-read-and-comprehend.pdf), [[homepage]](https://cs.nyu.edu/~kcho/DMQA/), sources: [[thomasmesnard/DeepMind-Teaching-Machines-to-Read-and-Comprehend]](https://github.com/thomasmesnard/DeepMind-Teaching-Machines-to-Read-and-Comprehend).
-- [2016 EMNLP] **SQuAD 100,000+ Questions for Machine Comprehension of Text**, [[paper]](https://arxiv.org/abs/1606.05250.pdf), [[homepage]](https://rajpurkar.github.io/SQuAD-explorer/).
-- [2016 ICLR] **bAbI: Towards AI-Complete Question Answering: a Set of Prerequisite Toy Tasks**, [[paper]](https://arxiv.org/abs/1502.05698.pdf), [[homepage]](https://research.fb.com/downloads/babi/), sources: [[facebook/bAbI-tasks]](https://github.com/facebook/bAbI-tasks).
+- [2016 EMNLP] **SQuAD 100,000+ Questions for Machine Comprehension of Text**, [[paper]](https://arxiv.org/pdf/1606.05250.pdf), [[homepage]](https://rajpurkar.github.io/SQuAD-explorer/).
+- [2016 ICLR] **bAbI: Towards AI-Complete Question Answering: a Set of Prerequisite Toy Tasks**, [[paper]](https://arxiv.org/pdf/1502.05698.pdf), [[homepage]](https://research.fb.com/downloads/babi/), sources: [[facebook/bAbI-tasks]](https://github.com/facebook/bAbI-tasks).
 - [2017 EMNLP] **World Knowledge for Reading Comprehension: Rare Entity Prediction with Hierarchical LSTMs Using External Descriptions**, [[paper]](http://aclweb.org/anthology/D17-1086), [[homepage]](http://dataset.cs.mcgill.ca/downloads/rare_entity_dataset.html).
 - [2017 EMNLP] **RACE: Large-scale ReAding Comprehension Dataset From Examinations**, [[paper]](https://arxiv.org/pdf/1704.04683.pdf), [[homepage]](http://www.cs.cmu.edu/~glai1/data/race/), sources: [[qizhex/RACE_AR_baselines]](https://github.com/qizhex/RACE_AR_baselines).
-- [2017 ACL] **TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension**, [[paper]](https://arxiv.org/abs/1705.03551.pdf), [[homepage]](http://nlp.cs.washington.edu/triviaqa/), sources: [[mandarjoshi90/triviaqa]](https://github.com/mandarjoshi90/triviaqa).
+- [2017 ACL] **TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension**, [[paper]](https://arxiv.org/pdf/1705.03551.pdf), [[homepage]](http://nlp.cs.washington.edu/triviaqa/), sources: [[mandarjoshi90/triviaqa]](https://github.com/mandarjoshi90/triviaqa).
 - [2017 ArXiv] **QAngaroo: Constructing Datasets for Multi-hop Reading Comprehension Across Documents**, [[paper]](https://arxiv.org/pdf/1710.06481.pdf), [[homepage]](http://qangaroo.cs.ucl.ac.uk),
-- [2018 ICLR] **CLOTH: Large-scale Cloze Test Dataset Designed by Teachers**, [[paper]](https://arxiv.org/abs/1711.03225.pdf), [[homepage]](http://www.qizhexie.com), sources: [[qizhex/Large-scale-Cloze-Test-Dataset-Designed-by-Teachers]](https://github.com/qizhex/Large-scale-Cloze-Test-Dataset-Designed-by-Teachers).
+- [2018 ICLR] **CLOTH: Large-scale Cloze Test Dataset Designed by Teachers**, [[paper]](https://arxiv.org/pdf/1711.03225.pdf), [[homepage]](http://www.qizhexie.com), sources: [[qizhex/Large-scale-Cloze-Test-Dataset-Designed-by-Teachers]](https://github.com/qizhex/Large-scale-Cloze-Test-Dataset-Designed-by-Teachers).
 - [2018 NAACL] **MultiRC: Looking Beyond the Surface -- A Challenge Set for Reading Comprehension over Multiple Sentences**, [[paper]](http://cogcomp.org/papers/2018-MultiRC-NAACL.pdf), [[homepage]](http://cogcomp.org/multirc/), sources: [[CogComp/multirc]](https://github.com/CogComp/multirc/).
 
 ## Machine Comprehension
 - [2014 NIPS] **Deep Learning for Answer Sentence Selection**, [[paper]](https://arxiv.org/pdf/1412.1632.pdf), sources: [[brmson/Sentence-selection]](https://github.com/brmson/Sentence-selection).
 - [2015 NIPS] **Pointer Networks**, [[paper]](https://arxiv.org/pdf/1506.03134.pdf), [[blog]](http://fastml.com/introduction-to-pointer-networks/), sources: [[devsisters/pointer-network-tensorflow]](https://github.com/devsisters/pointer-network-tensorflow), [[https://github.com/ikostrikov/TensorFlow-Pointer-Networks]](https://github.com/ikostrikov/TensorFlow-Pointer-Networks), [[keon/pointer-networks]](https://github.com/keon/pointer-networks), [[pemami4911/neural-combinatorial-rl-pytorch]](https://github.com/pemami4911/neural-combinatorial-rl-pytorch), [[shiretzet/PointerNet]](https://github.com/shiretzet/PointerNet).
-- [2016 ICLR] **LSTM-based Deep Learning Models for Non-factoid Answer Selection**, [[paper]](https://arxiv.org/abs/1511.04108.pdf), sources: [[Alan-Lee123/answer-selection]](https://github.com/Alan-Lee123/answer-selection), [[tambetm/allenAI]](https://github.com/tambetm/allenAI).
-- [2016 ACL] **A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task**, [[paper]](https://arxiv.org/abs/1606.02858.pdf), sources: [[danqi/rc-cnn-dailymail]](https://github.com/danqi/rc-cnn-dailymail).
-- [2017 ICLR] **Query-Reduction Networks for Question Answering**, [[paper]](https://arxiv.org/abs/1606.04582.pdf), [[homepage]](http://uwnlp.github.io/qrn/), sources: [[uwnlp/qrn]](https://github.com/uwnlp/qrn).
-- [2017 ICLR] **Bi-Directional Attention Flow for Machine Comprehension**, [[paper]](https://arxiv.org/abs/1611.01603.pdf), [[homepage]](https://allenai.github.io/bi-att-flow/), [[demo]](http://allgood.cs.washington.edu:1995), sources: [[allenai/bi-att-flow]](https://github.com/allenai/bi-att-flow).
+- [2016 ICLR] **LSTM-based Deep Learning Models for Non-factoid Answer Selection**, [[paper]](https://arxiv.org/pdf/1511.04108.pdf), sources: [[Alan-Lee123/answer-selection]](https://github.com/Alan-Lee123/answer-selection), [[tambetm/allenAI]](https://github.com/tambetm/allenAI).
+- [2016 ACL] **A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task**, [[paper]](https://arxiv.org/pdf/1606.02858.pdf), sources: [[danqi/rc-cnn-dailymail]](https://github.com/danqi/rc-cnn-dailymail).
+- [2017 ICLR] **Query-Reduction Networks for Question Answering**, [[paper]](https://arxiv.org/pdf/1606.04582.pdf), [[homepage]](http://uwnlp.github.io/qrn/), sources: [[uwnlp/qrn]](https://github.com/uwnlp/qrn).
+- [2017 ICLR] **Bi-Directional Attention Flow for Machine Comprehension**, [[paper]](https://arxiv.org/pdf/1611.01603.pdf), [[homepage]](https://allenai.github.io/bi-att-flow/), [[demo]](http://allgood.cs.washington.edu:1995), sources: [[allenai/bi-att-flow]](https://github.com/allenai/bi-att-flow).
 - [2017 ACL] **R-Net: Machine Reading Comprehension with Self-matching Networks**, [[paper]](https://www.microsoft.com/en-us/research/wp-content/uploads/2017/05/r-net.pdf), [[blog]](http://yerevann.github.io/2017/08/25/challenges-of-reproducing-r-net-neural-network-using-keras/), sources: [[HKUST-KnowComp/R-Net]](https://github.com/HKUST-KnowComp/R-Net), [[YerevaNN/R-NET-in-Keras]](https://github.com/YerevaNN/R-NET-in-Keras), [[minsangkim142/R-net]](https://github.com/minsangkim142/R-net).
-- [2017 ArXiv] **Simple and Effective Multi-Paragraph Reading Comprehension**, [[paper]](https://arxiv.org/abs/1710.10723.pdf), sources: [[allenai/document-qa]](https://github.com/allenai/document-qa).
-- [2017 CoNLL] **Making Neural QA as Simple as Possible but not Simpler**, [[paper]](https://arxiv.org/abs/1703.04816.pdf), [[homepage]](https://dirkweissenborn.github.io/publications.html), [[github-page]](https://github.com/georgwiese), sources: [[georgwiese/biomedical-qa]](https://github.com/georgwiese/biomedical-qa).
-- [2017 EMNLP] **Two-Stage Synthesis Networks for Transfer Learning in Machine Comprehension**, [[paper]](https://arxiv.org/abs/1706.09789.pdf), sources: [[davidgolub/QuestionGeneration]](https://github.com/davidgolub/QuestionGeneration).
-- [2017 ACL] **Attention-over-Attention Neural Networks for Reading Comprehension**, [[paper]](https://arxiv.org/abs/1607.04423.pdf), sources: [[OlavHN/attention-over-attention]](https://github.com/OlavHN/attention-over-attention), [[marshmelloX/attention-over-attention]](https://github.com/marshmelloX/attention-over-attention).
-- [2018 ICLR] **MaskGAN: Better Text Generation via Filling in the `______`**, [[paper]](https://arxiv.org/abs/1801.07736.pdf).
-- [2018 AAAI] **Multi-attention Recurrent Network for Human Communication Comprehension**, [[paper]](https://arxiv.org/abs/1802.00923.pdf).
-- [2018 ICLR] **FusionNet: Fusing via Fully-aware Attention with Application to Machine Comprehension**, [[paper]](https://arxiv.org/abs/1711.07341), sources: [[exe1023/FusionNet]](https://github.com/exe1023/FusionNet), [[momohuang/FusionNet-NLI]](https://github.com/momohuang/FusionNet-NLI).
+- [2017 ArXiv] **Simple and Effective Multi-Paragraph Reading Comprehension**, [[paper]](https://arxiv.org/pdf/1710.10723.pdf), sources: [[allenai/document-qa]](https://github.com/allenai/document-qa).
+- [2017 CoNLL] **Making Neural QA as Simple as Possible but not Simpler**, [[paper]](https://arxiv.org/pdf/1703.04816.pdf), [[homepage]](https://dirkweissenborn.github.io/publications.html), [[github-page]](https://github.com/georgwiese), sources: [[georgwiese/biomedical-qa]](https://github.com/georgwiese/biomedical-qa).
+- [2017 EMNLP] **Two-Stage Synthesis Networks for Transfer Learning in Machine Comprehension**, [[paper]](https://arxiv.org/pdf/1706.09789.pdf), sources: [[davidgolub/QuestionGeneration]](https://github.com/davidgolub/QuestionGeneration).
+- [2017 ACL] **Attention-over-Attention Neural Networks for Reading Comprehension**, [[paper]](https://arxiv.org/pdf/1607.04423.pdf), sources: [[OlavHN/attention-over-attention]](https://github.com/OlavHN/attention-over-attention), [[marshmelloX/attention-over-attention]](https://github.com/marshmelloX/attention-over-attention).
+- [2018 ICLR] **MaskGAN: Better Text Generation via Filling in the `______`**, [[paper]](https://arxiv.org/pdf/1801.07736.pdf).
+- [2018 AAAI] **Multi-attention Recurrent Network for Human Communication Comprehension**, [[paper]](https://arxiv.org/pdf/1802.00923.pdf).
+- [2018 ICLR] **FusionNet: Fusing via Fully-aware Attention with Application to Machine Comprehension**, [[paper]](https://arxiv.org/pdf/1711.07341.pdf), sources: [[exe1023/FusionNet]](https://github.com/exe1023/FusionNet), [[momohuang/FusionNet-NLI]](https://github.com/momohuang/FusionNet-NLI).
 - [2018 NAACL] **Contextualized Word Representations for Reading Comprehension**, [[paper]](https://arxiv.org/pdf/1712.03609.pdf), sources: [[shimisalant/CWR]](https://github.com/shimisalant/CWR).
 - [2018 ICLR] **QANet: Combing Local Convolution with Global Self-Attention for Reading Comprehension**, [[paper]](https://arxiv.org/pdf/1804.09541.pdf), sources: [[hengruo/QANet-pytorch]](https://github.com/hengruo/QANet-pytorch), [[NLPLearn/QANet]](https://github.com/NLPLearn/QANet).
 - [2018 ACL] **Knowledgeable Reader: Enhancing Cloze-Style Reading Comprehension with External Commonsense Knowledge**, [[paper]](https://arxiv.org/pdf/1805.07858.pdf).
@@ -34,18 +34,18 @@
 ## Question Answering on Knowledgebase
 - [2014 ACL] **Freebase QA: Information Extraction or Semantic Parsing?**, [[paper]](http://aclweb.org/anthology/W14-2416).
 - [2016 ACL] **Question Answering on Freebase via Relation Extraction and Textual Evidence**, [[paper]](http://www.aclweb.org/anthology/P16-1220), sources: [[syxu828/QuestionAnsweringOverFB]](https://github.com/syxu828/QuestionAnsweringOverFB).
-- [2017 ACL] **An End-to-End Model for Question Answering over Knowledge Base with Cross-Attention Combining Global Knowledge**, [[paper]](https://arxiv.org/abs/1606.00979.pdf), [[homepage]](http://www.nlpr.ia.ac.cn/cip/~liukang/index.html), [[blog]](http://blog.csdn.net/LAW_130625/article/details/78484866).
-- [2017 ACL] **Improved Neural Relation Detection for Knowledge Base Question Answering**, [[paper]](https://arxiv.org/abs/1704.06194.pdf).
-- [2017 ACL] **Reading Wikipedia to Answer Open-Domain Questions**, [[paper]](https://arxiv.org/abs/1704.00051.pdf), sources: [[facebookresearch/DrQA]](https://github.com/facebookresearch/DrQA), [[hitvoice/DrQA]](https://github.com/hitvoice/DrQA).
-- [2017 ArXiv] **Dynamic Integration of Background Knowledge in Neural NLU Systems**, [[paper]](https://arxiv.org/abs/1706.02596.pdf), [[homepage]](https://dirkweissenborn.github.io/publications.html).
-- [2018 ArXiv] **An Attention-Based Word-Level Interaction Model: Relation Detection for Knowledge Base Question Answering**, [[paper]](https://arxiv.org/abs/1801.09893.pdf).
+- [2017 ACL] **An End-to-End Model for Question Answering over Knowledge Base with Cross-Attention Combining Global Knowledge**, [[paper]](https://arxiv.org/pdf/1606.00979.pdf), [[homepage]](http://www.nlpr.ia.ac.cn/cip/~liukang/index.html), [[blog]](http://blog.csdn.net/LAW_130625/article/details/78484866).
+- [2017 ACL] **Improved Neural Relation Detection for Knowledge Base Question Answering**, [[paper]](https://arxiv.org/pdf/1704.06194.pdf).
+- [2017 ACL] **Reading Wikipedia to Answer Open-Domain Questions**, [[paper]](https://arxiv.org/pdf/1704.00051.pdf), sources: [[facebookresearch/DrQA]](https://github.com/facebookresearch/DrQA), [[hitvoice/DrQA]](https://github.com/hitvoice/DrQA).
+- [2017 ArXiv] **Dynamic Integration of Background Knowledge in Neural NLU Systems**, [[paper]](https://arxiv.org/pdf/1706.02596.pdf), [[homepage]](https://dirkweissenborn.github.io/publications.html).
+- [2018 ArXiv] **An Attention-Based Word-Level Interaction Model: Relation Detection for Knowledge Base Question Answering**, [[paper]](https://arxiv.org/pdf/1801.09893.pdf).
 - [2018 SemEval] **Yuanfudao at SemEval-2018 Task 11: Three-way Attention and Relational Knowledge for Commonsense Machine Comprehension**, [[paper]](https://arxiv.org/pdf/1803.00191.pdf), sources: [[intfloat/commonsense-rc]](https://github.com/intfloat/commonsense-rc).
 
 ## Memory Networks
-- [2015 ICLR] **Memory Networks**, [[paper]](https://arxiv.org/abs/1410.3916.pdf), sources: [[facebook/MemNN]](https://github.com/facebook/MemNN).
-- [2015 NIPS] **End-To-End Memory Networks**, [[paper]](https://arxiv.org/abs/1503.08895.pdf), sources: [[facebook/MemNN]](https://github.com/facebook/MemNN), [[seominjoon/memnn-tensorflow]](https://github.com/seominjoon/memnn-tensorflow), [[domluna/memn2n]](https://github.com/domluna/memn2n), [[carpedm20/MemN2N-tensorflow]](https://github.com/carpedm20/MemN2N-tensorflow).
-- [2016 ICML] **Dynamic Memory Networks for Visual and Textual Question Answering**, [[paper]](https://arxiv.org/abs/1603.01417), [[blog]](https://yerevann.github.io/2016/02/05/implementing-dynamic-memory-networks/), sources: [[therne/dmn-tensorflow]](https://github.com/therne/dmn-tensorflow), [[barronalex/Dynamic-Memory-Networks-in-TensorFlow]](https://github.com/barronalex/Dynamic-Memory-Networks-in-TensorFlow), [[ethancaballero/Improved-Dynamic-Memory-Networks-DMN-plus]](https://github.com/ethancaballero/Improved-Dynamic-Memory-Networks-DMN-plus), [[dandelin/Dynamic-memory-networks-plus-Pytorch]](https://github.com/dandelin/Dynamic-memory-networks-plus-Pytorch), [[DeepRNN/visual_question_answering]](https://github.com/DeepRNN/visual_question_answering).
-- [2016 ICML] **Ask Me Anything: Dynamic Memory Networks for Natural Language Processing**, [[paper]](https://arxiv.org/abs/1506.07285), sources: [[DongjunLee/dmn-tensorflow]](https://github.com/DongjunLee/dmn-tensorflow).
+- [2015 ICLR] **Memory Networks**, [[paper]](https://arxiv.org/pdf/1410.3916.pdf), sources: [[facebook/MemNN]](https://github.com/facebook/MemNN).
+- [2015 NIPS] **End-To-End Memory Networks**, [[paper]](https://arxiv.org/pdf/1503.08895.pdf), sources: [[facebook/MemNN]](https://github.com/facebook/MemNN), [[seominjoon/memnn-tensorflow]](https://github.com/seominjoon/memnn-tensorflow), [[domluna/memn2n]](https://github.com/domluna/memn2n), [[carpedm20/MemN2N-tensorflow]](https://github.com/carpedm20/MemN2N-tensorflow).
+- [2016 ICML] **Dynamic Memory Networks for Visual and Textual Question Answering**, [[paper]](https://arxiv.org/pdf/1603.01417.pdf), [[blog]](https://yerevann.github.io/2016/02/05/implementing-dynamic-memory-networks/), sources: [[therne/dmn-tensorflow]](https://github.com/therne/dmn-tensorflow), [[barronalex/Dynamic-Memory-Networks-in-TensorFlow]](https://github.com/barronalex/Dynamic-Memory-Networks-in-TensorFlow), [[ethancaballero/Improved-Dynamic-Memory-Networks-DMN-plus]](https://github.com/ethancaballero/Improved-Dynamic-Memory-Networks-DMN-plus), [[dandelin/Dynamic-memory-networks-plus-Pytorch]](https://github.com/dandelin/Dynamic-memory-networks-plus-Pytorch), [[DeepRNN/visual_question_answering]](https://github.com/DeepRNN/visual_question_answering).
+- [2016 ICML] **Ask Me Anything: Dynamic Memory Networks for Natural Language Processing**, [[paper]](https://arxiv.org/pdf/1506.07285.pdf), sources: [[DongjunLee/dmn-tensorflow]](https://github.com/DongjunLee/dmn-tensorflow).
 
 ## Modified LSTM/GRU for Machine Comprehension
 - [2016 EMNLP] **Long Short-Term Memory-Networks for Machine Reading**, [[paper]](https://arxiv.org/pdf/1601.06733.pdf), sources: [[cheng6076/SNLI-attention]](https://github.com/cheng6076/SNLI-attention), [[vsitzmann/snli-attention-tensorflow]](https://github.com/vsitzmann/snli-attention-tensorflow).
diff --git a/readme/nlp/machine_translation.md b/readme/nlp/machine_translation.md
index 7e692eb..0a1755e 100644
--- a/readme/nlp/machine_translation.md
+++ b/readme/nlp/machine_translation.md
@@ -1,13 +1,13 @@
 # Machine Translation
 
-- [2014 SSST] **On the properties of neural machine Translation Encoder-Decoder Approaches**, [[paper]](https://arxiv.org/abs/1409.1259).
-- [2015 ICLR] **Neural Machine Translation by Jointly Learning to Align and Translate**, [[paper]](https://arxiv.org/abs/1409.0473), sources: [[lisa-groundhog/GroundHog]](https://github.com/lisa-groundhog/GroundHog/tree/master/experiments/nmt), [[tensorflow/nmt]](https://github.com/tensorflow/nmt).
+- [2014 SSST] **On the properties of neural machine Translation Encoder-Decoder Approaches**, [[paper]](https://arxiv.org/pdf/1409.1259.pdf).
+- [2015 ICLR] **Neural Machine Translation by Jointly Learning to Align and Translate**, [[paper]](https://arxiv.org/pdf/1409.0473.pdf), sources: [[lisa-groundhog/GroundHog]](https://github.com/lisa-groundhog/GroundHog/tree/master/experiments/nmt), [[tensorflow/nmt]](https://github.com/tensorflow/nmt).
 - [2015 EMNLP] **Effective Approaches to Attention-based Neural Machine Translation**, [[paper]](http://aclweb.org/anthology/D15-1166), [[HarvardNLP homepage]](http://nlp.seas.harvard.edu/code/), sources: [[dillonalaird/Attention]](https://github.com/dillonalaird/Attention), [[tensorflow/nmt]](https://github.com/tensorflow/nmt).
-- [2017 ACL] **A Convolutional Encoder Model for Neural Machine Translation**, [[paper]](https://arxiv.org/abs/1611.02344), sources: [[facebookresearch/fairseq]](https://github.com/facebookresearch/fairseq).
+- [2017 ACL] **A Convolutional Encoder Model for Neural Machine Translation**, [[paper]](https://arxiv.org/pdf/1611.02344.pdf), sources: [[facebookresearch/fairseq]](https://github.com/facebookresearch/fairseq).
 - [2017 NIPS] **Attention is All You Need**, [[paper]](https://papers.nips.cc/paper/7181-attention-is-all-you-need.pdf), [[Chinses blog]](http://www.cnblogs.com/robert-dlut/p/8638283.html), sources: [[Kyubyong/transformer]](https://github.com/Kyubyong/transformer), [[jadore801120/attention-is-all-you-need-pytorch]](https://github.com/jadore801120/attention-is-all-you-need-pytorch), [[DongjunLee/transformer-tensorflow]](https://github.com/DongjunLee/transformer-tensorflow).
 - [2017 EMNLP] **Neural Machine Translation with Word Predictions**, [[paper]](http://www.aclweb.org/anthology/D17-1013).
 - [2017 EMNLP] **Massive Exploration of Neural Machine Translation Architectures**, [[paper]](http://aclweb.org/anthology/D17-1151), [[homepage]](https://google.github.io/seq2seq/), sources: [[google/seq2seq]](https://github.com/google/seq2seq).
 - [2017 EMNLP] **Efficient Attention using a Fixed-Size Memory Representation**, [[paper]](http://aclweb.org/anthology/D17-1040).
-- [2018 AMTA] **Context Models for OOV Word Translation in Low-Resource Language**, [[paper]](https://arxiv.org/abs/1801.08660).
+- [2018 AMTA] **Context Models for OOV Word Translation in Low-Resource Language**, [[paper]](https://arxiv.org/pdf/1801.08660.pdf).
 - [2018 NAACL] **Self-Attention with Relative Position Representations**, [[paper]](https://arxiv.org/pdf/1803.02155.pdf).
 - [2018 COLING] **Double Path Networks for Sequence to Sequence Learning**, [[paper]](https://arxiv.org/pdf/1806.04856.pdf).
\ No newline at end of file
diff --git a/readme/nlp/sentiment_analysis.md b/readme/nlp/sentiment_analysis.md
index 6c76f50..611ce8b 100644
--- a/readme/nlp/sentiment_analysis.md
+++ b/readme/nlp/sentiment_analysis.md
@@ -3,7 +3,7 @@
 ## General
 - **Introduction to Sentiment Analysis**, [[slides]](https://lct-master.org/files/MullenSentimentCourseSlides.pdf), [[blog]](https://blog.algorithmia.com/introduction-sentiment-analysis-algorithms/).
 - **OSU Twitter NLP Tools**, [[aritter/twitter_nlp]](https://github.com/aritter/twitter_nlp).
-- [2016 ACM] **Stance and Sentiment in Tweets**, [[paper]](https://arxiv.org/abs/1605.01655.pdf).
+- [2016 ACM] **Stance and Sentiment in Tweets**, [[paper]](https://arxiv.org/pdf/1605.01655.pdf).
 
 ## Sentiment Analysis
 - [2013 EMNLP] **Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank**, [[paper]](https://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf), sources: [[rksltnl/RNTN]](https://github.com/rksltnl/RNTN), [[awni/semantic-rntn]](https://github.com/awni/semantic-rntn), [[rgobbel/rntn]](https://github.com/rgobbel/rntn).
@@ -15,15 +15,15 @@
 - [2016 ICDM] **Convolutional MKL Based Multimodal Emotion Recognition and Sentiment Analysis**, [[paper]](http://sentic.net/convolutional-mkl-based-multimodal-sentiment-analysis.pdf), sources: [[SenticNet/multimodal-sentiment-detection]](https://github.com/SenticNet/multimodal-sentiment-detection).
 - [2017 ICME] **Select-additive Learning: Improving Generalization in Multimodal Sentiment Analysis**, [[paper]](https://arxiv.org/pdf/1609.05244.pdf), sources: [[HaohanWang/SelectAdditiveLearning]](https://github.com/HaohanWang/SelectAdditiveLearning).
 - [2017 ICMI] **Multimodal Sentiment Analysis with Word-Level Fusion and Reinforcement Learning**, [[paper]](http://www.cs.cmu.edu/~pliang/papers/icmi2017-gme-camera.pdf).
-- [2017 ACM SIGIR] **Multitask Learning for Fine-Grained Twitter Sentiment Analysis**, [[paper]](https://arxiv.org/abs/1707.03569.pdf), sources: [[balikasg/sigir2017]](https://github.com/balikasg/sigir2017).
+- [2017 ACM SIGIR] **Multitask Learning for Fine-Grained Twitter Sentiment Analysis**, [[paper]](https://arxiv.org/pdf/1707.03569.pdf), sources: [[balikasg/sigir2017]](https://github.com/balikasg/sigir2017).
 - [2017 EMNLP] **Tensor Fusion Network for Multimodal Sentiment Analysis**, [[paper]](https://www.aclweb.org/anthology/D17-1115), sources: [[A2Zadeh/TensorFusionNetwork]](https://github.com/A2Zadeh/TensorFusionNetwork).
 - [2017 ACL] **Context-Dependent Sentiment Analysis in User-Generated Videos**, [[paper]](http://sentic.net/context-dependent-sentiment-analysis-in-user-generated-videos.pdf), sources: [[SenticNet/contextual-sentiment-analysis]](https://github.com/SenticNet/contextual-sentiment-analysis).
-- [2018 ACL] **Multiple Instance Learning Networks for Fine-Grained Sentiment Analysis**, [[paper]](https://arxiv.org/abs/1711.09645.pdf), [[data]](https://github.com/EdinburghNLP/spot-data).
+- [2018 ACL] **Multiple Instance Learning Networks for Fine-Grained Sentiment Analysis**, [[paper]](https://arxiv.org/pdf/1711.09645.pdf), [[data]](https://github.com/EdinburghNLP/spot-data).
 - [2018 AAAI] **Targeted Aspect-Based Sentiment Analysis via Embedding Commonsense Knowledge into an Attentive LSTM**, [[paper]](http://sentic.net/sentic-lstm.pdf).
 - [2018 Cognitive Computation] **Sentic LSTM: a Hybrid Network for Targeted Aspect-Based Sentiment Analysis**, [[paper]](https://link.springer.com/article/10.1007/s12559-018-9549-x), sources: [[SenticNet/sentic-lstm]](https://github.com/SenticNet/sentic-lstm).
 
 ## Stance Detection
 - [2016 SemEval] **SemEval-2016 Task 6: Detecting Stance in Tweets**, [[paper]](http://www.aclweb.org/anthology/S16-1003), [[homepage]](http://alt.qcri.org/semeval2016/task6/), [[The SemEval-2016 Stance Dataset]](http://www.saifmohammad.com/WebPages/StanceDataset.htm).
-- [2016 SemEval] **DeepStance at SemEval-2016 Task 6: Detecting Stance in Tweets Using Character and Word-Level CNNs**, [[paper]](https://arxiv.org/abs/1606.05694).
+- [2016 SemEval] **DeepStance at SemEval-2016 Task 6: Detecting Stance in Tweets Using Character and Word-Level CNNs**, [[paper]](https://arxiv.org/pdf/1606.05694.pdf).
 - [2016 SEM@ACL] **Detecting Stance in Tweets And Analyzing its Interaction with Sentiment**, [[paper]](http://anthology.aclweb.org/S16-2021), sources: [[vishaalmohan/twitter-stance-detection]](https://github.com/vishaalmohan/twitter-stance-detection).
-- [2018 ECIR] **Topical Stance Detection for Twitter: A Two-Phase LSTM Model Using Attention**, [[paper]](https://arxiv.org/abs/1801.03032).
+- [2018 ECIR] **Topical Stance Detection for Twitter: A Two-Phase LSTM Model Using Attention**, [[paper]](https://arxiv.org/pdf/1801.03032.pdf).
diff --git a/readme/nlp/sequence_labeling.md b/readme/nlp/sequence_labeling.md
index 7807676..dcddbef 100644
--- a/readme/nlp/sequence_labeling.md
+++ b/readme/nlp/sequence_labeling.md
@@ -3,20 +3,20 @@
 ## General
 - [2011 JMLR] **Natural Language Processing (Almost) from Scratch**, cover _Tagging, Chunking, Parsing, NER, SRL and etc._ tasks, [[paper]](https://arxiv.org/pdf/1103.0398.pdf), sources: [[attardi/deepnl]](https://github.com/attardi/deepnl).
 - [2012 Springer] **Supervised Sequence Labelling with Recurrent Neural Networks**, [[Alex Graves's Ph.D. Thesis]](https://www.cs.toronto.edu/~graves/phd.pdf).
-- [2016 ArXiv] **Multi-Task Cross-Lingual Sequence Tagging from Scratch**, [[paper]](https://arxiv.org/abs/1603.06270).
+- [2016 ArXiv] **Multi-Task Cross-Lingual Sequence Tagging from Scratch**, [[paper]](https://arxiv.org/pdf/1603.06270.pdf).
 - [2017 EMNLP] **A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks**, cover _Tagging, Chunking, Parsing, Relatedness, Entailment_ tasks, [[paper]](https://arxiv.org/pdf/1611.01587.pdf), [[blog]](https://theneuralperspective.com/2017/03/08/a-joint-many-task-model-growing-a-neural-network-for-multiple-nlp-tasks/), sources: [[rubythonode/joint-many-task-model]](https://github.com/rubythonode/joint-many-task-model).
 - [2018 NAACL] **Zero-shot Sequence Labeling: Transferring Knowledge from Sentences to Tokens**, [[paper]](http://aclweb.org/anthology/N18-1027).
 
 ## POS Tagging and Named Entity Recognition
-- [2015 ArXiv] **Bidirectional LSTM-CRF Models for Sequence Tagging**, [[paper]](https://arxiv.org/abs/1508.01991.pdf), [[blog]](https://guillaumegenthial.github.io/sequence-tagging-with-tensorflow.html), sources: [[Hironsan/anago]](https://github.com/Hironsan/anago), [[guillaumegenthial/sequence_tagging]](https://github.com/guillaumegenthial/sequence_tagging).
-- [2016 ACL] **Multilingual Part-of-Speech Tagging with Bidirectional Long Short-Term Memory Models and Auxiliary Loss**, [[paper]](https://arxiv.org/abs/1604.05529.pdf), sources: [[bplank/bilstm-aux]](https://github.com/bplank/bilstm-aux).
+- [2015 ArXiv] **Bidirectional LSTM-CRF Models for Sequence Tagging**, [[paper]](https://arxiv.org/pdf/1508.01991.pdf), [[blog]](https://guillaumegenthial.github.io/sequence-tagging-with-tensorflow.html), sources: [[Hironsan/anago]](https://github.com/Hironsan/anago), [[guillaumegenthial/sequence_tagging]](https://github.com/guillaumegenthial/sequence_tagging).
+- [2016 ACL] **Multilingual Part-of-Speech Tagging with Bidirectional Long Short-Term Memory Models and Auxiliary Loss**, [[paper]](https://arxiv.org/pdf/1604.05529.pdf), sources: [[bplank/bilstm-aux]](https://github.com/bplank/bilstm-aux).
 - [2016 ACL] **Named Entity Recognition with Bidirectional LSTM-CNNs**, [[paper]](https://www.aclweb.org/anthology/Q16-1026), sources: [[ThanhChinhBK/Ner-BiLSTM-CNNs]](https://github.com/ThanhChinhBK/Ner-BiLSTM-CNNs).
-- [2016 NAACL] **Neural Architectures for Named Entity Recognition**, [[paper]](https://arxiv.org/abs/1603.01360), sources: [[clab/stack-lstm-ner]](https://github.com/clab/stack-lstm-ner), [[glample/tagger]](https://github.com/glample/tagger), [[marekrei/sequence-labeler]](https://github.com/marekrei/sequence-labeler).
-- [2016 ACL] **End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF**, [[paper]](https://arxiv.org/abs/1603.01354), sources: [[LopezGG/NN_NER_tensorFlow]](https://github.com/LopezGG/NN_NER_tensorFlow).
+- [2016 NAACL] **Neural Architectures for Named Entity Recognition**, [[paper]](https://arxiv.org/pdf/1603.01360.pdf), sources: [[clab/stack-lstm-ner]](https://github.com/clab/stack-lstm-ner), [[glample/tagger]](https://github.com/glample/tagger), [[marekrei/sequence-labeler]](https://github.com/marekrei/sequence-labeler).
+- [2016 ACL] **End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF**, [[paper]](https://arxiv.org/pdf/1603.01354.pdf), sources: [[LopezGG/NN_NER_tensorFlow]](https://github.com/LopezGG/NN_NER_tensorFlow).
 - [2017 EMNLP] **Part-of-Speech Tagging for Twitter with Adversarial Neural Networks**, [[paper]](https://www.aclweb.org/anthology/D17-1256), sources: [[guitaowufeng/TPANN]](https://github.com/guitaowufeng/TPANN).
-- [2017 EMNLP] **Fast and Accurate Entity Recognition with Iterated Dilated Convolutions**, [[paper]](https://arxiv.org/abs/1702.02098), sources: [[iesl/dilated-cnn-ner]](https://github.com/iesl/dilated-cnn-ner).
+- [2017 EMNLP] **Fast and Accurate Entity Recognition with Iterated Dilated Convolutions**, [[paper]](https://arxiv.org/pdf/1702.02098.pdf), sources: [[iesl/dilated-cnn-ner]](https://github.com/iesl/dilated-cnn-ner).
 - [2017 ICLR] **Transfer Learning for Sequence Tagging with Hierarchical Recurrent Networks**, [[paper]](https://arxiv.org/pdf/1703.06345.pdf), sources: [[kimiyoung/transfer]](https://github.com/kimiyoung/transfer).
-- [2017 EMNLP] **Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks**, [[paper]](https://arxiv.org/abs/1707.06799), sources: [[UKPLab/emnlp2017-bilstm-cnn-crf]](https://github.com/UKPLab/emnlp2017-bilstm-cnn-crf).
+- [2017 EMNLP] **Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks**, [[paper]](https://arxiv.org/pdf/1707.06799.pdf), sources: [[UKPLab/emnlp2017-bilstm-cnn-crf]](https://github.com/UKPLab/emnlp2017-bilstm-cnn-crf).
 - [2017 InterSpeech] **Label-dependency coding in Simple Recurrent Networks for Spoken Language Understanding**, [[paper]](https://hal.inria.fr/hal-01553830/document).
 - [2017 ACL] **Model Transfer for Tagging Low-resource Languages using a Bilingual Dictionary**, [[paper]](http://aclweb.org/anthology/P17-2093), sources: [[mengf1/trpos]](https://github.com/mengf1/trpos).
 - [2017 EMNLP] **Semi-Supervised Structured Prediction with Neural CRF Autoencoder**, [[paper]](http://aclweb.org/anthology/D17-1179), sources: [[cosmozhang/NCRF-AE]](https://github.com/cosmozhang/NCRF-AE).
@@ -44,7 +44,7 @@
 
 ## Semantic Role Labeling
 - [2015 ACL] **End-to-end Learning of Semantic Role Labeling using RNN**, [[paper]](http://www.aclweb.org/anthology/P15-1109), sources: [[sanjaymeena/semantic_role_labeling_deep_learning]](https://github.com/sanjaymeena/semantic_role_labeling_deep_learning), [[hiroki13/neural-semantic-role-labeler]](https://github.com/hiroki13/neural-semantic-role-labeler).
-- [2016 ACL] **Neural Semantic Role Labeling with Dependency Path Embeddings**, [[paper]](https://arxiv.org/abs/1605.07515), sources: [[microth/PathLSTM]](https://github.com/microth/PathLSTM).
+- [2016 ACL] **Neural Semantic Role Labeling with Dependency Path Embeddings**, [[paper]](https://arxiv.org/pdf/1605.07515.pdf), sources: [[microth/PathLSTM]](https://github.com/microth/PathLSTM).
 - [2017 ACL] **Deep Semantic Role Labeling: What Works and Whats Next**, [[paper]](https://homes.cs.washington.edu/~luheng/files/acl2017_hllz.pdf), sources: [[luheng/deep_srl]](https://github.com/luheng/deep_srl).
 - [2018 AAAI] **Deep Semantic Role Labeling with Self-Attention**, [[paper]](https://arxiv.org/pdf/1712.01586.pdf), sources: [[XMUNLP/Tagger]](https://github.com/XMUNLP/Tagger).
 
diff --git a/readme/nlp/text_classification.md b/readme/nlp/text_classification.md
index 23a2029..fa12313 100644
--- a/readme/nlp/text_classification.md
+++ b/readme/nlp/text_classification.md
@@ -1,12 +1,12 @@
 # Text, Sentence and Document Classification
 
-- [2014 EMNLP] **Convolutional Neural Networks for Sentence Classification**, [[paper]](https://arxiv.org/abs/1408.5882), sources: [[yoonkim/CNN_sentence]](https://github.com/yoonkim/CNN_sentence), [[dennybritz/cnn-text-classification-tf]](https://github.com/dennybritz/cnn-text-classification-tf).
+- [2014 EMNLP] **Convolutional Neural Networks for Sentence Classification**, [[paper]](https://arxiv.org/pdf/1408.5882.pdf), sources: [[yoonkim/CNN_sentence]](https://github.com/yoonkim/CNN_sentence), [[dennybritz/cnn-text-classification-tf]](https://github.com/dennybritz/cnn-text-classification-tf).
 - [2015 ACL] **Deep Unordered Composition Rivals Syntactic Methods for Text Classification**, [[paper]](https://www.cs.umd.edu/~miyyer/pubs/2015_acl_dan.pdf), [[slides]](https://pdfs.semanticscholar.org/7a5d/565e7abeb5e4570c1222dba0e5b1df18664a.pdf), sources: [[miyyer/dan]](https://github.com/miyyer/dan).
 - [2015 AAAI] **Recurrent Convolutional Neural Networks for Text Classification**, [[paper]](https://www.aaai.org/ocs/index.php/AAAI/AAAI15/paper/view/9745/9552), sources: [[knok/rcnn-text-classification]](https://github.com/knok/rcnn-text-classification), [[airalcorn2/Recurrent-Convolutional-Neural-Network-Text-Classifier]](https://github.com/airalcorn2/Recurrent-Convolutional-Neural-Network-Text-Classifier).
 - [2016 NAACL] **Hierarchical Attention Networks for Document Classification**, [[paper]](https://www.cs.cmu.edu/%7Ediyiy/docs/naacl16.pdf), sources: [[richliao/textClassifier]](https://github.com/richliao/textClassifier), [[ematvey/hierarchical-attention-networks]](https://github.com/ematvey/hierarchical-attention-networks).
-- [2017 EACL] **Bag of Tricks for Efficient Text Classification**, [[paper]](https://arxiv.org/abs/1607.01759), sources: [[facebookresearch/fastText]](https://github.com/facebookresearch/fastText).
-- [2017 ArXiv] **Which Encoding is the Best for Text Classification in Chinese, English, Japanese and Korean?**, [[paper]](https://arxiv.org/abs/1708.02657), sources: [[zhangxiangxiao/glyph]](https://github.com/zhangxiangxiao/glyph).
+- [2017 EACL] **Bag of Tricks for Efficient Text Classification**, [[paper]](https://arxiv.org/pdf/1607.01759.pdf), sources: [[facebookresearch/fastText]](https://github.com/facebookresearch/fastText).
+- [2017 ArXiv] **Which Encoding is the Best for Text Classification in Chinese, English, Japanese and Korean?**, [[paper]](https://arxiv.org/pdf/1708.02657.pdf), sources: [[zhangxiangxiao/glyph]](https://github.com/zhangxiangxiao/glyph).
 - [2017 ArXiv] **Multi-Task Label Embedding for Text Classification**, [[paper]](https://arxiv.org/pdf/1710.07210.pdf), [[blog]](https://www.jianshu.com/p/4bbe061f0acd).
 - [2017 ICLR] **Adversarial Training Methods For Semi-Supervised Text Classification**, [[paper]](https://arxiv.org/pdf/1605.07725.pdf), sources: [[TobiasLee/Text-Classification]](https://github.com/TobiasLee/Text-Classification).
-- [2018 ArXiv] **Densely Connected Bidirectional LSTM with Applications to Sentence Classification**, [[paper]](https://arxiv.org/abs/1802.00889), source: [[IsaacChanghau/Dense_BiLSTM]](https://github.com/IsaacChanghau/Dense_BiLSTM).
+- [2018 ArXiv] **Densely Connected Bidirectional LSTM with Applications to Sentence Classification**, [[paper]](https://arxiv.org/pdf/1802.00889.pdf), source: [[IsaacChanghau/Dense_BiLSTM]](https://github.com/IsaacChanghau/Dense_BiLSTM).
 - [2018 NAACL] **Multinomial Adversarial Networks for Multi-Domain Text Classification**, [[paper]](http://aclweb.org/anthology/N18-1111), sources: [[ccsasuke/man]](https://github.com/ccsasuke/man).
\ No newline at end of file