Handwritten Digit Recognition Using Machine Learning Algorithms
Article PDF (English)

Keywords

pattern recognition
handwritten recognition
digit recognition
machine learning
WEKA
off-line handwritten recognition
machine learning algorithm

How to Cite

S M Shamim. (2018). Handwritten Digit Recognition Using Machine Learning Algorithms. Global Journal of Computer Science and Technology, 18(D1), 17–23. Retrieved from https://gjcst.com/index.php/gjcst/article/view/539

Abstract

Handwritten character recognition is one of the practically important problems in pattern recognition. Applications of digit recognition include postal mail sorting, bank check processing, form data entry, etc. The heart of the problem lies in developing an efficient algorithm that can recognize handwritten digits submitted by users via scanners, tablets, and other digital devices. This paper presents an approach to off-line handwritten digit recognition based on different machine learning techniques. The main objective of this paper is to ensure effective and reliable approaches for the recognition of handwritten digits. Several machine learning algorithms, namely Multilayer Perceptron, Support Vector Machine, Naïve Bayes, Bayes Net, Random Forest, J48, and Random Tree, have been applied to digit recognition using WEKA. The results show that the highest accuracy, 90.37%, was obtained by the Multilayer Perceptron.
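The classification workflow the abstract describes (train several learners on labeled digit images, compare test accuracy) can be sketched in a few lines. The following is a minimal illustration, not the paper's actual WEKA experiment: it uses scikit-learn's bundled 8x8 digits dataset and a Multilayer Perceptron, the classifier the paper reports as most accurate.

```python
# Illustrative sketch only: scikit-learn's 8x8 digits dataset stands in for
# the paper's data, and MLPClassifier stands in for WEKA's MultilayerPerceptron.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# 1797 grayscale digit images, each flattened to 64 pixel features.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# One hidden layer of 64 units; train and evaluate on the held-out split.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"MLP test accuracy: {acc:.4f}")
```

Swapping `MLPClassifier` for other estimators (e.g. `sklearn.svm.SVC` or `sklearn.ensemble.RandomForestClassifier`) reproduces the paper's comparative setup in spirit.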
Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright (c) 2018 Authors and Global Journals Private Limited