Elastic exponential linear units for convolutional neural networks

DC Field Value Language
dc.contributor.author Kim, Daeho -
dc.contributor.author Kim, Jinah -
dc.contributor.author Kim, Jaeil -
dc.date.accessioned 2020-12-10T07:46:19Z -
dc.date.available 2020-12-10T07:46:19Z -
dc.date.created 2020-05-13 -
dc.date.issued 2020-09 -
dc.identifier.issn 0925-2312 -
dc.identifier.uri https://sciwatch.kiost.ac.kr/handle/2020.kiost/38589 -
dc.description.abstract Activation functions play an important role in determining the depth and nonlinearity of deep learning models. Since the Rectified Linear Unit (ReLU) was introduced, many modifications in which noise is intentionally injected have been proposed to avoid overfitting risks. Furthermore, the Exponential Linear Unit (ELU) and its variants with trainable parameters have been proposed to reduce the bias shift effects which are often observed with ReLU-type activation functions. In this paper, we propose a novel activation function, called the Elastic Exponential Linear Unit (EELU), which combines the advantages of both types of activation functions in a generalized form. EELU not only has an elastic slope in the positive part, but also preserves the negative signal through a small nonzero gradient. We also present a new strategy for inserting Gaussian-distributed neuronal noise into the activation function to improve generalization. In our experiments, we demonstrate how EELU can represent a wider variety of features with random noise than other activation functions by visualizing the latent features of convolutional neural networks. We evaluate the effectiveness of EELU through extensive image-classification experiments on CIFAR-10/CIFAR-100, ImageNet, and Tiny ImageNet. Experimental results show that EELU achieved better generalization performance and improved classification accuracy over conventional and recent activation functions. -
dc.description.uri 1 -
dc.language English -
dc.publisher ELSEVIER -
dc.title Elastic exponential linear units for convolutional neural networks -
dc.type Article -
dc.citation.endPage 266 -
dc.citation.startPage 253 -
dc.citation.title NEUROCOMPUTING -
dc.citation.volume 406 -
dc.contributor.alternativeName Kim, Jinah -
dc.identifier.bibliographicCitation NEUROCOMPUTING, v.406, pp.253 - 266 -
dc.identifier.doi 10.1016/j.neucom.2020.03.051 -
dc.identifier.scopusid 2-s2.0-85085103725 -
dc.identifier.wosid 000541716500012 -
dc.type.docType Article -
dc.description.journalClass 1 -
dc.description.isOpenAccess N -
dc.subject.keywordPlus NOISE -
dc.relation.journalWebOfScienceCategory Computer Science, Artificial Intelligence -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.relation.journalResearchArea Computer Science -
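The abstract describes EELU's two ingredients: a positive part whose slope is made "elastic" by Gaussian noise during training, and an ELU-style negative part that keeps a small nonzero gradient. A minimal NumPy sketch under simplified assumptions follows; the hyperparameters `alpha`, `beta`, and `sigma`, and the exact noise placement, are illustrative and not taken from the paper.

```python
import numpy as np

def eelu(x, alpha=1.0, beta=1.0, sigma=0.1, training=True, rng=None):
    """Sketch of an Elastic Exponential Linear Unit (EELU).

    During training, positive inputs are scaled by an elastic slope k
    drawn per-element from a Gaussian centered at 1 (k = 1 at
    inference). Negative inputs follow an ELU-style exponential,
    alpha * (exp(beta * x) - 1), so the negative signal survives with
    a small nonzero gradient. alpha, beta, and sigma are assumed
    hyperparameters, not the paper's exact formulation.
    """
    x = np.asarray(x, dtype=float)
    if training:
        rng = rng or np.random.default_rng()
        k = 1.0 + sigma * rng.standard_normal(x.shape)
    else:
        k = 1.0  # deterministic slope at inference
    return np.where(x >= 0, k * x, alpha * (np.exp(beta * x) - 1.0))
```

At inference the function reduces to an ELU-like shape (identity for positive inputs, saturating toward -alpha for large negative inputs), which is why the noise only perturbs training-time activations.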
Appears in Collections:
Sea Power Enhancement Research Division > Coastal Disaster & Safety Research Department > 1. Journal Articles
Files in This Item:
There are no files associated with this item.


