Handwritten Digit Recognition Using Machine Learning Algorithms

Keywords

pattern recognition
handwritten recognition
digit recognition
machine learning
WEKA
off-line handwritten recognition
machine learning algorithm

How to Cite

S M Shamim. (2018). Handwritten Digit Recognition Using Machine Learning Algorithms. Global Journal of Computer Science and Technology, 18(D1), 17–23. Retrieved from https://gjcst.com/index.php/gjcst/article/view/539

Abstract

Handwritten character recognition is one of the practically important issues in pattern recognition applications. Applications of digit recognition include postal mail sorting, bank check processing, form data entry, etc. The heart of the problem lies in the ability to develop an efficient algorithm that can recognize handwritten digits submitted by users via a scanner, tablet, or other digital device. This paper presents an approach to off-line handwritten digit recognition based on different machine learning techniques. The main objective of this paper is to ensure effective and reliable approaches for the recognition of handwritten digits. Several machine learning algorithms, namely Multilayer Perceptron, Support Vector Machine, Naïve Bayes, Bayes Net, Random Forest, J48, and Random Tree, have been used for the recognition of digits using WEKA. The results of this paper show that the highest accuracy, 90.37%, was obtained with the Multilayer Perceptron.
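The paper runs its classifier comparison in WEKA, a Java-based toolkit. As an illustrative analogue of that workflow, the following sketch trains several of the same classifier families (Multilayer Perceptron, SVM, Naïve Bayes, Random Forest) and compares their test accuracies; it uses scikit-learn and its bundled 8×8 digits dataset, which are assumptions for illustration and not the paper's actual toolkit or data.

```python
# Hedged sketch of a WEKA-style classifier comparison, using scikit-learn
# and its bundled 8x8 digits dataset (assumptions; not the paper's setup).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Load the digit images (flattened 8x8 grayscale) and their labels.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# Classifier families corresponding to some of those compared in the paper.
classifiers = {
    "Multilayer Perceptron": MLPClassifier(max_iter=500, random_state=42),
    "Support Vector Machine": SVC(),
    "Naive Bayes": GaussianNB(),
    "Random Forest": RandomForestClassifier(random_state=42),
}

# Train each classifier and record its accuracy on the held-out test set.
results = {}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    results[name] = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: {results[name]:.4f}")
```

On this toy dataset the relative ordering of classifiers need not match the paper's 90.37% MLP result, since the features and evaluation protocol differ.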

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright (c) 2018 Authors and Global Journals Private Limited