TY - JOUR
T1 - An LP-based hyperparameter optimization model for language modeling
AU - Rahnama, Amir Hossein Akhavan
AU - Toloo, Mehdi
AU - Zaidenberg, Nezer Jacob
N1 - Publisher Copyright:
© 2018, Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2018/5/1
Y1 - 2018/5/1
N2 - To find hyperparameters for a machine learning model, search algorithms such as grid search or random search are run over the space of possible hyperparameter values, selecting the solution that minimizes a specific cost function. In language modeling, perplexity is one of the most popular cost functions. In this study, we propose a fractional nonlinear programming model that finds the optimal perplexity value. The special structure of the model allows us to approximate it by a linear programming model that can be solved with the well-known simplex algorithm. To the best of our knowledge, this is the first attempt to use optimization techniques to find perplexity values in the language modeling literature. We apply our model to find the hyperparameters of a language model, compare it with the grid search algorithm, and show that it attains lower perplexity values. We validate the proposed approach on a real-world dataset from SwiftKey.
AB - To find hyperparameters for a machine learning model, search algorithms such as grid search or random search are run over the space of possible hyperparameter values, selecting the solution that minimizes a specific cost function. In language modeling, perplexity is one of the most popular cost functions. In this study, we propose a fractional nonlinear programming model that finds the optimal perplexity value. The special structure of the model allows us to approximate it by a linear programming model that can be solved with the well-known simplex algorithm. To the best of our knowledge, this is the first attempt to use optimization techniques to find perplexity values in the language modeling literature. We apply our model to find the hyperparameters of a language model, compare it with the grid search algorithm, and show that it attains lower perplexity values. We validate the proposed approach on a real-world dataset from SwiftKey.
KW - Hyperparameter optimization
KW - Language model
KW - Linear programming
KW - Machine learning
KW - Optimization
KW - n-Grams
UR - http://www.scopus.com/inward/record.url?scp=85040232951&partnerID=8YFLogxK
U2 - 10.1007/s11227-018-2236-6
DO - 10.1007/s11227-018-2236-6
M3 - Article
AN - SCOPUS:85040232951
SN - 0920-8542
VL - 74
SP - 2151
EP - 2160
JO - The Journal of Supercomputing
JF - The Journal of Supercomputing
IS - 5
ER -