Show simple item record

dc.rights.license: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.contributor.author: López-Albán D.
dc.contributor.author: López-Barrera A.
dc.contributor.author: Mayorca-Torres D.
dc.contributor.author: Peluffo-Ordóñez D.
dc.contributor.editor: Florez H.
dc.contributor.editor: Pollo-Cattaneo M.F.
dc.contributor.other: 4th International Conference on Applied Informatics, ICAI 2021
dc.date.accessioned: 2024-12-02T20:15:54Z
dc.date.available: 2024-12-02T20:15:54Z
dc.date.issued: 2021
dc.identifier.isbn: 978-303089653-9
dc.identifier.issn: 18650929
dc.identifier.uri: https://hdl.handle.net/20.500.14112/28986
dc.description.abstract: Sign language is the form of communication between the deaf and hearing population, which uses the gesture-spatial configuration of the hands as a communication channel with the social environment. This work proposes a gesture recognition method for sign language based on processing time series of the spatial positions of hand reference points captured by a Leap Motion optical sensor. The methodology is applied to a validated American Sign Language (ASL) dataset and involves the following stages: (i) preprocessing to filter null frames, (ii) segmentation of relevant information, (iii) time-frequency characterization using the Discrete Wavelet Transform (DWT), and (iv) classification with machine learning algorithms. The proposed methodology achieves a classification accuracy of 97.96% with the Fast Tree algorithm. © 2021, Springer Nature Switzerland AG.
dc.format: 12
dc.format.medium: Electronic resource
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: Springer Science and Business Media Deutschland GmbH
dc.rights.uri: Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
dc.source: Communications in Computer and Information Science
dc.source: Commun. Comput. Info. Sci.
dc.source: Scopus
dc.title: Sign Language Recognition Using Leap Motion Based on Time-Frequency Characterization and Conventional Machine Learning Techniques
datacite.contributor: Universidad Mariana, Pasto, 520001, Colombia
datacite.contributor: Mohammed VI Polytechnic University, Ben Guerir, 47963, Morocco
datacite.contributor: SDAS Research Group, Ben Guerir, 47963, Morocco
datacite.contributor: López-Albán D., Universidad Mariana, Pasto, 520001, Colombia
datacite.contributor: López-Barrera A., Universidad Mariana, Pasto, 520001, Colombia
datacite.contributor: Mayorca-Torres D., Universidad Mariana, Pasto, 520001, Colombia, SDAS Research Group, Ben Guerir, 47963, Morocco
datacite.contributor: Peluffo-Ordóñez D., Mohammed VI Polytechnic University, Ben Guerir, 47963, Morocco, SDAS Research Group, Ben Guerir, 47963, Morocco
datacite.contributor: 4th International Conference on Applied Informatics, ICAI 2021
datacite.rights: http://purl.org/coar/access_right/c_abf2
oaire.resourcetype: http://purl.org/coar/resource_type/c_c94f
oaire.version: http://purl.org/coar/version/c_ab4af688f83e57aa
dc.contributor.contactperson: D. López-Albán
dc.contributor.contactperson: Universidad Mariana, Pasto, 520001, Colombia
dc.contributor.contactperson: email: diegoanlopez@umariana.edu.co
dc.identifier.doi: 10.1007/978-3-030-89654-6_5
dc.identifier.instname: Universidad Mariana
dc.identifier.reponame: Repositorio Clara de Asis
dc.identifier.url: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85119001753&doi=10.1007%2f978-3-030-89654-6_5&partnerID=40&md5=4f87d12d91e92c18dbeea1f7bf1229d0
dc.relation.citationendpage: 67
dc.relation.citationstartpage: 55
dc.relation.citationvolume: 1455 CCIS
dc.relation.conferencedate: 28 October 2021 through 30 October 2021
dc.relation.conferenceplace: Buenos Aires
dc.relation.iscitedby: 1
dc.relation.references: Deafness and Hearing Loss, (2021)
dc.relation.references: Al-Hammadi M., et al., Deep learning-based approach for sign language gesture recognition with efficient hand gesture representation, IEEE Access, 8, pp. 192527-192542, (2020)
dc.relation.references: Cheok M.J., Omar Z., Jaward M.H., A review of hand gesture and sign language recognition techniques, Int. J. Mach. Learn. Cybern., 10, 1, pp. 131-153, (2017)
dc.relation.references: Zafrulla Z., Brashear H., Starner T., Hamilton H., Presti P., American sign language recognition with the kinect, Proceedings of the 13th International Conference on Multimodal Interfaces, pp. 279-286, (2011)
dc.relation.references: Chong T.W., Lee B.G., American sign language recognition using leap motion controller with a machine learning approach, Sensors, 18, 10, (2018)
dc.relation.references: Weichert F., Bachmann D., Rudak B., Fisseler D., Analysis of the accuracy and robustness of the leap motion controller, Sensors, 13, 5, pp. 6380-6393, (2013)
dc.relation.references: Lei L., Dashun Q., Design of data-glove and Chinese sign language recognition system based on ARM9, 2015 12th IEEE International Conference on Electronic Measurement & Instruments (ICEMI), vol. 3, pp. 1130-1134, IEEE, (2015)
dc.relation.references: Marin G., Dominio F., Zanuttigh P., Hand gesture recognition with leap motion and kinect devices, 2014 IEEE International Conference on Image Processing (ICIP), pp. 1565-1569
dc.relation.references: Shin H., Kim W.J., Jang K.A., Korean sign language recognition based on image and convolution neural network, Proceedings of the 2nd International Conference on Image and Graphics Processing, pp. 52-55, (2019)
dc.relation.references: Weerasekera C.S., Jaward M.H., Kamrani N., Robust ASL fingerspelling recognition using local binary patterns and geometric features, 2013 International Conference on Digital Image Computing: Techniques and Applications (DICTA), pp. 1-8, (2013)
dc.relation.references: Ravi S., Suman M., Kishore P.V.V., Kumar E.K., Kumar M.T.K., et al., Multi modal spatio temporal cotrained CNNs with single modal testing on RGB-D based sign language gesture recognition, J. Comput. Lang., 52, pp. 88-102, (2019)
dc.relation.references: Su Y., Qing Z., Continuous Chinese sign language recognition with CNN-LSTM, Proceedings of SPIE 10420, Ninth International Conference on Digital Image Processing (ICDIP 2017)
dc.relation.references: Hernandez V., Suzuki T., Venture G., Convolutional and recurrent neural network for human activity recognition: Application on American sign language, PLoS ONE, 15, 2, (2020)
dc.relation.references: Shanmuganathan V., Yesudhas H.R., Khan M.S., Khari M., Gandomi A.H., R-CNN and wavelet feature extraction for hand gesture recognition with EMG signals, Neural Comput. Appl., 32, 21, pp. 16723-16736, (2020)
dc.relation.references: Hernandez V., Suzuki T., Venture G., American Sign Language Classification-Leapmotion-25 Subjects-60 Signs, Mendeley Data, (2018)
dc.relation.references: Vysocky A., Grushko S., Oscadal P., Kot T., Babjak J., Janos R., Sukop M., Bobovsky Z., Analysis of precision and stability of hand tracking with leap motion sensor, Sensors, 20, (2020)
dc.rights.accessrights: info:eu-repo/semantics/openAccess
dc.subject.keywords: Discrete wavelet transform
dc.subject.keywords: Leap motion
dc.subject.keywords: Machine learning
dc.subject.keywords: Sign language
dc.subject.keywords: Audition
dc.subject.keywords: Discrete wavelet transforms
dc.subject.keywords: Information filtering
dc.subject.keywords: Learning algorithms
dc.subject.keywords: Signal reconstruction
dc.subject.keywords: Trees (mathematics)
dc.subject.keywords: Communications channels
dc.subject.keywords: Conventional machines
dc.subject.keywords: Discrete-wavelet-transform
dc.subject.keywords: Leap motion
dc.subject.keywords: Machine learning techniques
dc.subject.keywords: Sign language
dc.subject.keywords: Sign Language recognition
dc.subject.keywords: Social environment
dc.subject.keywords: Spatial configuration
dc.subject.keywords: Time-frequency characterization
dc.subject.keywords: Machine learning
dc.type.driver: info:eu-repo/semantics/conferenceObject
dc.type.hasversion: info:eu-repo/semantics/acceptedVersion
dc.type.redcol: http://purl.org/redcol/resource_type/ARTDATA
dc.type.spa: Contribución a congreso / Conferencia
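The abstract describes a four-stage pipeline whose time-frequency step (iii) summarizes each hand-trajectory time series with Discrete Wavelet Transform coefficients before classification. As a minimal illustrative sketch only: the Haar wavelet and the energy-per-sub-band features below are assumptions for illustration, not necessarily the wavelet, features, or code the authors used.

```python
import math

def haar_dwt(signal):
    """One level of the orthonormal Haar discrete wavelet transform:
    pairs of samples become one approximation and one detail coefficient."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))
        detail.append((a - b) / math.sqrt(2))
    return approx, detail

def dwt_features(signal, levels=3):
    """Decompose a 1-D time series over several levels and summarize
    each detail sub-band (plus the final approximation) by its energy,
    a common time-frequency feature vector for a downstream classifier."""
    features = []
    current = list(signal)
    for _ in range(levels):
        current, detail = haar_dwt(current)
        features.append(sum(d * d for d in detail))
    features.append(sum(c * c for c in current))
    return features
```

Because the Haar transform is orthonormal, the feature energies sum to the energy of the input signal (Parseval's relation), which is a quick sanity check on the decomposition.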


Files in this item

Files  Size  Format  View

There are no files associated with this item.

This item appears in the following collection(s)
