Machine Learning Model Optimization with Hyper Parameter Tuning Approach
Article PDF (English)

Keywords

machine learning
hyper parameter optimization
grid search
random search
BO-GP

How to Cite

Md Riyad Hossain, & Dr. Douglas Timmer. (2021). Machine Learning Model Optimization with Hyper Parameter Tuning Approach. Global Journal of Computer Science and Technology, 21(D2), 7–13. Retrieved from https://gjcst.com/index.php/gjcst/article/view/2059

Abstract

Hyper-parameter tuning is a key step in finding the optimal parameters of a machine learning model. Determining the best hyper-parameters takes a good deal of time, especially when the objective function is costly to evaluate or a large number of parameters must be tuned. In contrast to conventional machine learning algorithms, neural networks require more hyper-parameter tuning because they process many parameters together, and depending on the fine-tuning, the accuracy of the model can vary between 25% and 90%. A few of the most effective techniques for tuning hyper-parameters in deep learning methods are grid search, random search, Bayesian optimization, etc. Every method has advantages and disadvantages over the others. For example, grid search has proven to be an effective technique for tuning hyper-parameters, but it has drawbacks such as trying too many combinations and performing poorly when many parameters must be tuned at once. In our work, we determine, show, and analyze the efficiencies of different parameters and tuning methods on a real-world synthetic polymer dataset.
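The contrast the abstract draws between grid search and random search can be sketched in a few lines. This is a minimal illustration, not the paper's method: `validation_error` is a hypothetical stand-in for an expensive model-training run, and the parameter names (`lr`, `reg`) and ranges are assumptions chosen for the example.

```python
import itertools
import random

# Hypothetical objective: validation error as a function of two
# hyper-parameters (a stand-in for an expensive model-training run).
def validation_error(lr, reg):
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

# Grid search: evaluate every combination, so cost grows
# multiplicatively with the number of parameters tuned at once.
lrs = [0.001, 0.01, 0.1, 1.0]
regs = [0.001, 0.01, 0.1]
grid_best = min(itertools.product(lrs, regs),
                key=lambda p: validation_error(*p))

# Random search: sample a fixed budget of configurations
# (here, log-uniformly over the same ranges).
random.seed(0)
samples = [(10 ** random.uniform(-3, 0), 10 ** random.uniform(-3, -1))
           for _ in range(12)]
rand_best = min(samples, key=lambda p: validation_error(*p))

print("grid best:", grid_best)
print("random best:", rand_best)
```

Grid search here costs 4 × 3 = 12 evaluations for just two parameters; with many parameters the combinations explode, which is the drawback the abstract notes, while random search keeps the budget fixed regardless of dimensionality.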
Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright (c) 2021 Authors and Global Journals Private Limited