Machine Learning Model Optimization with Hyper Parameter Tuning Approach
Article PDF (English)

Keywords

machine learning
hyper-parameter optimization
grid search
random search
BO-GP

How to cite

Md Riyad Hossain, & Dr. Douglas Timmer. (2021). Machine Learning Model Optimization with Hyper Parameter Tuning Approach. Revista Global De Ciencia Y tecnología informática, 21(D2), 7–13. Retrieved from https://gjcst.com/index.php/gjcst/article/view/2059

Abstract

Hyper-parameter tuning is a key step in finding the optimal parameters of a machine learning model. Determining the best hyper-parameters takes a good deal of time, especially when the objective function is costly to evaluate or a large number of parameters must be tuned. In contrast to conventional machine learning algorithms, neural networks require more hyper-parameter tuning because they process many parameters together, and depending on the fine tuning, model accuracy can vary between 25% and 90%. Among the most effective techniques for tuning hyper-parameters in deep learning methods are grid search, random search, and Bayesian optimization. Every method has advantages and disadvantages over the others. For example, grid search has proven to be an effective tuning technique, but it has drawbacks: it tries too many combinations and performs poorly when many parameters must be tuned at once. In our work, we determine, show, and analyze the efficiencies of different parameters and tuning methods on a real-world synthetic polymer dataset.
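The contrast the abstract draws between grid search (exhaustive, combinatorially expensive) and random search (fixed sampling budget) can be sketched in a few lines. This is a minimal illustration against a made-up validation objective, not the paper's actual experiment; the parameter names `lr` and `reg` and the objective's optimum are assumptions for the example only.

```python
import itertools
import random

def objective(lr, reg):
    # Hypothetical validation score (assumed for illustration):
    # highest at lr = 0.1, reg = 0.01.
    return -((lr - 0.1) ** 2 + (reg - 0.01) ** 2)

def grid_search(grid):
    # Evaluate every combination; the number of trials grows
    # multiplicatively with each added hyper-parameter.
    return max(itertools.product(*grid.values()),
               key=lambda combo: objective(*combo))

def random_search(space, n_trials, seed=0):
    # Sample a fixed budget of random points from continuous ranges;
    # the budget is independent of how many parameters are tuned.
    rng = random.Random(seed)
    candidates = [tuple(rng.uniform(lo, hi) for lo, hi in space.values())
                  for _ in range(n_trials)]
    return max(candidates, key=lambda combo: objective(*combo))

grid = {"lr": [0.001, 0.01, 0.1, 1.0], "reg": [0.0, 0.01, 0.1]}
print(grid_search(grid))                                  # -> (0.1, 0.01)
print(random_search({"lr": (0.001, 1.0), "reg": (0.0, 0.1)}, n_trials=50))
```

Grid search here runs 4 × 3 = 12 evaluations and can only return points on the grid, while random search explores the continuous space with whatever trial budget is affordable, which is why it scales better as the number of tuned parameters grows.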
Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright 2021 Authors and Global Journals Private Limited