OpenNMT: Open-Source Neural Machine Translation

OpenNMT is a full-featured, open-source (MIT) neural machine translation system utilizing the Torch mathematical toolkit.

The system is designed to be simple to use and easy to extend, while maintaining efficiency and state-of-the-art translation accuracy. Features include:

  • Speed and memory optimizations for high-performance GPU training.
  • Simple general-purpose interface, requiring only source/target data files.
  • C++ implementation of the translator for easy deployment.
  • Extensions to allow other sequence generation tasks such as summarization and image captioning.

Installation

OpenNMT only requires a vanilla Torch install with a few dependencies. Alternatively, there is a (CUDA) Docker container.

Dependencies

  • nn
  • nngraph
  • tds
  • penlight

GPU training requires:

  • cunn
  • cutorch

Multi-GPU training additionally requires:

  • threads
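
The dependencies above can typically be installed through LuaRocks, which ships with a standard Torch distribution. A minimal sketch (package names as listed above; the GPU and multi-GPU rocks are only needed for those use cases):

```shell
# Core dependencies (CPU training and translation)
luarocks install nn
luarocks install nngraph
luarocks install tds
luarocks install penlight

# GPU training (requires a working CUDA toolkit)
luarocks install cutorch
luarocks install cunn

# Multi-GPU training
luarocks install threads
```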

Quickstart

OpenNMT consists of three commands:

  1. Preprocess the data.

th preprocess.lua -train_src data/src-train.txt -train_tgt data/tgt-train.txt -valid_src data/src-val.txt -valid_tgt data/tgt-val.txt -save_data data/demo

  2. Train the model.

th train.lua -data data/demo-train.t7 -save_model model

  3. Translate sentences.

th translate.lua -model model_final.t7 -src data/src-test.txt -output pred.txt

See the guide for more details.

Citation

A technical report on OpenNMT is available. If you use the system for academic work, please cite:

    @ARTICLE{2017opennmt,
         author = { {Klein}, G. and {Kim}, Y. and {Deng}, Y. 
                    and {Senellart}, J. and {Rush}, A.~M.},
         title = "{OpenNMT: Open-Source Toolkit 
                   for Neural Machine Translation}",
         journal = {ArXiv e-prints},
         eprint = {1701.02810} }
