[Demo] A Hyperparameter Optimization Toolkit for Neural Machine Translation Research
Xuan Zhang, Kevin Duh, Paul McNamee
Track: Machine Translation (Demo Paper)
Session: Demo Session 4 (Poster)
Conference Room: Frontenac Ballroom and Queen's Quay
Conference Time: July 11, 11:00-12:30 (EDT) (America/Toronto)
Global Time: July 11, Demo Session 4 (15:00-16:30 UTC)
TLDR:
A hyperparameter optimization toolkit for neural machine translation, built as a wrapper around the open-source Sockeye NMT software, that uses the Asynchronous Successive Halving Algorithm (ASHA) to find near-optimal models under a fixed computational budget.
Abstract:
Hyperparameter optimization is an important but often overlooked process in the research of deep learning technologies. To obtain a good model, one must carefully tune hyperparameters that determine the architecture and training algorithm. Insufficient tuning may result in poor results, while inequitable tuning may lead to exaggerated differences between models. We present a hyperparameter optimization toolkit for neural machine translation (NMT) to help researchers focus their time on the creative rather than the mundane. The toolkit is implemented as a wrapper on top of the open-source Sockeye NMT software. Using the Asynchronous Successive Halving Algorithm (ASHA), we demonstrate that it is possible to discover near-optimal models under a computational budget with little effort.
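For context on the search strategy, here is a minimal Python sketch of the successive-halving idea that ASHA parallelizes. It is illustrative only: the function `train_and_eval`, the dummy scorer, and the hyperparameter space below are hypothetical stand-ins rather than the toolkit's API, and the real toolkit runs the halving rounds asynchronously over parallel Sockeye training jobs instead of in a synchronous loop.

```python
import random

def successive_halving(configs, train_and_eval, min_resource=1, eta=3):
    """Synchronous successive halving, the core idea behind ASHA.

    configs:        list of hyperparameter dicts to evaluate
    train_and_eval: callable(config, resource) -> validation score
                    (higher is better); `resource` might be a number
                    of training checkpoints
    eta:            keep the top 1/eta configs each round and give
                    survivors eta times more resource
    """
    resource = min_resource
    survivors = list(configs)
    while len(survivors) > 1:
        # Train every surviving config at the current budget, then rank.
        scored = sorted(((train_and_eval(c, resource), c) for c in survivors),
                        key=lambda sc: sc[0], reverse=True)
        keep = max(1, len(survivors) // eta)
        survivors = [c for _, c in scored[:keep]]
        resource *= eta
    return survivors[0]

def dummy_train_and_eval(config, resource):
    # Stand-in for launching a Sockeye training run and reading its
    # validation score; returns a synthetic noisy value for demonstration.
    return resource * config["lr"] + random.random()

# Hypothetical search space: random learning rates and encoder depths.
space = [{"lr": random.choice([1e-4, 3e-4, 1e-3]),
          "num_layers": random.choice([4, 6])}
         for _ in range(9)]
best = successive_halving(space, dummy_train_and_eval)
print("best config:", best)
```

In practice each `train_and_eval` call corresponds to continuing an NMT training run for a few more checkpoints, which is what lets a method like ASHA discard weak configurations early and spend the remaining budget on promising ones.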
Code: https://github.com/kevinduh/sockeye-recipes3
Video demo: https://cs.jhu.edu/~kevinduh/j/demo.mp4