Please use this identifier to cite or link to this item: http://repository.futminna.edu.ng:8080/jspui/handle/123456789/17072
Full metadata record
DC Field | Value | Language
dc.contributor.author | Dogo, E. M. | -
dc.contributor.author | Afolabi, O. J. | -
dc.contributor.author | Twala, B. | -
dc.date.accessioned | 2023-01-12T15:00:02Z | -
dc.date.available | 2023-01-12T15:00:02Z | -
dc.date.issued | 2022-11-23 | -
dc.identifier.uri | http://repository.futminna.edu.ng:8080/jspui/handle/123456789/17072 | -
dc.description | Section: Computing and Artificial Intelligence; Special Issue: Deep Learning Architectures for Computer Vision | en_US
dc.description.abstract | The continued increase in computing resources is one key factor allowing deep learning researchers to scale, design, and train new and complex convolutional neural network (CNN) architectures of varying width, depth, or both to improve performance on a variety of problems. The contributions of this study include uncovering how different optimization algorithms affect CNN architectural setups that vary in width, depth, or both. Specifically, three different CNN architectural setups, in combination with nine different optimization algorithms (vanilla SGD, SGD with momentum, SGD with Nesterov momentum, RMSProp, ADAM, ADAGrad, ADADelta, ADAMax, and NADAM), are trained and evaluated using three publicly available benchmark image classification datasets. Through extensive experimentation, we analyze the output predictions of the different optimizers with the CNN architectures using accuracy, convergence speed, and loss function as performance metrics. Findings across the three image classification datasets show that ADAM and NADAM achieved superior performance with the wider and the deeper/wider setups, respectively, while ADADelta was the worst performer, especially with the deeper CNN architectural setup. | en_US
dc.language.iso | en | en_US
dc.publisher | MDPI Applied Sciences | en_US
dc.relation.ispartofseries | 12;11976 | -
dc.subject | optimization algorithms | en_US
dc.subject | neural network | en_US
dc.subject | network size | en_US
dc.subject | performance analysis | en_US
dc.subject | image classification | en_US
dc.title | On the Relative Impact of Optimizers on Convolutional Neural Networks with Varying Depth and Width for Image Classification | en_US
dc.type | Article | en_US
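The experimental design described in the abstract, three CNN architectural setups crossed with nine optimizers over three datasets, can be sketched as a simple run grid. All concrete depth/width values and dataset names below are illustrative assumptions, not taken from the paper:

```python
from itertools import product

# Hypothetical encoding of the three architectural setups from the abstract
# (wider, deeper, and deeper/wider); layer counts and filter widths here
# are illustrative only.
SETUPS = {
    "wider":        {"depth": 4, "width": 128},
    "deeper":       {"depth": 8, "width": 32},
    "deeper_wider": {"depth": 8, "width": 128},
}

# The nine optimization algorithms evaluated in the study.
OPTIMIZERS = [
    "sgd", "sgd_momentum", "sgd_nesterov",
    "rmsprop", "adam", "adagrad",
    "adadelta", "adamax", "nadam",
]

# The abstract does not name the three benchmark datasets, so placeholder
# identifiers stand in for them.
DATASETS = ["dataset_a", "dataset_b", "dataset_c"]

def experiment_grid():
    """Enumerate every (setup, optimizer, dataset) training run."""
    return list(product(SETUPS, OPTIMIZERS, DATASETS))

runs = experiment_grid()
print(len(runs))  # 3 setups x 9 optimizers x 3 datasets = 81 runs
```

Each tuple in `runs` corresponds to one training/evaluation run whose accuracy, convergence speed, and loss would then be compared across optimizers.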
Appears in Collections: Computer Engineering



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.