Machine Learning Model Optimization with Hyper Parameter Tuning Approach

Keywords

machine learning
hyper parameter optimization
grid search
random search
BO-GP

How to Cite

Md Riyad Hossain, & Dr. Douglas Timmer. (2021). Machine Learning Model Optimization with Hyper Parameter Tuning Approach. Global Journal of Computer Science and Technology, 21(D2), 7–13. Retrieved from https://gjcst.com/index.php/gjcst/article/view/2059

Abstract

Hyper-parameter tuning is a key step in finding the optimal machine learning parameters. Determining the best hyper-parameters takes a good deal of time, especially when the objective function is costly to evaluate or a large number of parameters must be tuned. In contrast to conventional machine learning algorithms, neural networks require more hyper-parameter tuning because they process many parameters together, and depending on the fine tuning, the accuracy of the model can vary between 25–90%. A few of the most effective techniques for tuning hyper-parameters in deep learning methods are grid search, random search, Bayesian optimization, etc. Every method has some advantages and disadvantages over the others. For example, grid search has proven to be an effective technique to tune hyper-parameters, but it has drawbacks such as trying too many combinations and performing poorly when many parameters must be tuned at a time. In our work we determine, show, and analyze the efficiencies of different parameters and tuning methods on a real-world synthetic polymer dataset.
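The trade-off the abstract describes between grid search (exhaustive, combinatorially expensive) and random search (fixed sampling budget) can be sketched with a toy example; the objective function and parameter names (`lr`, `depth`) below are illustrative stand-ins, not the paper's actual polymer model:

```python
import itertools
import random

# Toy stand-in for a validation-accuracy objective; its maximum is
# at lr=0.1, depth=5 by construction (an assumption for illustration).
def objective(lr, depth):
    return -((lr - 0.1) ** 2) - 0.01 * ((depth - 5) ** 2)

# Grid search: evaluate every combination of the candidate values.
# Cost grows multiplicatively with each added hyper-parameter.
grid = {"lr": [0.001, 0.01, 0.1, 1.0], "depth": [2, 5, 8, 11]}
best_grid = max(
    (dict(zip(grid, combo)) for combo in itertools.product(*grid.values())),
    key=lambda p: objective(**p),
)

# Random search: sample a fixed budget of configurations, so the cost
# stays constant no matter how many hyper-parameters are added.
random.seed(0)
best_rand = max(
    ({"lr": 10 ** random.uniform(-3, 0), "depth": random.randint(2, 11)}
     for _ in range(16)),
    key=lambda p: objective(**p),
)

print("grid search best:  ", best_grid)
print("random search best:", best_rand)
```

Bayesian optimization (the BO-GP keyword above) replaces the blind sampling in the second loop with a surrogate model that proposes the next configuration to try, which is why it tends to need fewer evaluations when the objective is expensive.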

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright (c) 2021 Authors and Global Journals Private Limited