Show simple item record

dc.rights.license: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.contributor.author: Lopez D.A.
dc.contributor.author: Lopez M.A.
dc.contributor.author: Muñoz D.S.
dc.contributor.author: Santa J.A.
dc.contributor.author: Gomez D.F.
dc.contributor.author: Barone D.
dc.contributor.author: Torresen J.
dc.contributor.author: Salas J.A.R.
dc.contributor.editor: Ribeiro P.R.
dc.contributor.editor: Cota V.R.
dc.contributor.editor: Barone D.A.
dc.contributor.editor: de Oliveira A.C.
dc.contributor.other: 3rd Latin American Workshop on Computational Neuroscience, LAWCN 2021
dc.date.accessioned: 2024-12-02T20:16:05Z
dc.date.available: 2024-12-02T20:16:05Z
dc.date.issued: 2022
dc.identifier.isbn: 978-303108442-3
dc.identifier.issn: 18650929
dc.identifier.uri: https://hdl.handle.net/20.500.14112/29018
dc.description.abstract: Each day, robotic systems become more familiar and common in contexts such as factories, hospitals, homes, and restaurants, creating a need for affordable and intuitive interfaces for effective and engaging communication with humans. Likewise, innovative devices that offer alternative methods of interacting with machines allow us to create new interfaces, improving learning, training, and motion application. Thus, this paper compares two interaction modes that use a Leap Motion to control a robotic manipulator (UR3) simulator. Users can control the robot through numerical gestures that set the joint angles (coded mode) or through clockwise/counter-clockwise gestures that increase or decrease the angle values (open mode). We evaluate these modes objectively, capturing from 30 subjects the number of gestures and the time needed to reach three specific poses. We also collected subjective questionnaires to compare the control methods and preferences. Our findings suggest that both methods employ a similar number of gestures, but coded control takes less time, with higher variation among ages. Moreover, subjects' preferences indicate a slight inclination towards the open mode. Finally, further work should explore tasks of varying difficulty and a larger population to gain a more general understanding of the preferences and performance. © 2022, Springer Nature Switzerland AG.
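The abstract contrasts two gesture-control modes for the UR3 simulator: coded mode (a numeric gesture sets a joint angle directly) and open mode (clockwise/counter-clockwise gestures nudge the current angle). A minimal sketch of that distinction follows; the function names, the 30°-per-digit mapping, and the 5° step size are illustrative assumptions, since the record does not specify the actual gesture-to-angle mapping.

```python
# Hypothetical sketch of the two interaction modes described in the abstract.
# All constants and names are assumptions, not the paper's implementation.

NUM_JOINTS = 6   # the UR3 arm has six revolute joints
STEP_DEG = 5.0   # assumed per-gesture increment in open mode

def coded_mode(angles, joint, digit_gesture):
    """Coded mode: a numeric hand gesture sets a joint angle directly
    (here, assumed to map each digit to a multiple of 30 degrees)."""
    new = list(angles)
    new[joint] = digit_gesture * 30.0
    return new

def open_mode(angles, joint, clockwise):
    """Open mode: a rotation gesture nudges the joint up or down by a step."""
    new = list(angles)
    new[joint] += STEP_DEG if clockwise else -STEP_DEG
    return new

pose = [0.0] * NUM_JOINTS
pose = coded_mode(pose, joint=1, digit_gesture=3)   # joint 1 set to 90 deg
pose = open_mode(pose, joint=1, clockwise=False)    # joint 1 nudged to 85 deg
print(pose)
```

The sketch mirrors the trade-off the study measures: coded mode reaches a target angle in one gesture, while open mode needs repeated small steps but may feel more natural to users.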
dc.format: 13
dc.format.medium: Recurso electrónico
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: Springer Science and Business Media Deutschland GmbH
dc.rights.uri: Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
dc.source: Communications in Computer and Information Science
dc.source: Commun. Comput. Info. Sci.
dc.source: Scopus
dc.title: Controlling the UR3 Robotic Arm Using a Leap Motion: A Comparative Study
datacite.contributor: Mariana University, Pasto, Colombia
datacite.contributor: University Institution of Envigado, Envigado, Colombia
datacite.contributor: Federal University Rio Grande do Sul, Porto Alegre, Brazil
datacite.contributor: University of Oslo, Oslo, Norway
datacite.contributor: Lopez D.A., Mariana University, Pasto, Colombia
datacite.contributor: Lopez M.A., Mariana University, Pasto, Colombia
datacite.contributor: Muñoz D.S., Mariana University, Pasto, Colombia
datacite.contributor: Santa J.A., Mariana University, Pasto, Colombia
datacite.contributor: Gomez D.F., Mariana University, Pasto, Colombia
datacite.contributor: Barone D., Federal University Rio Grande do Sul, Porto Alegre, Brazil
datacite.contributor: Torresen J., University of Oslo, Oslo, Norway
datacite.contributor: Salas J.A.R., University Institution of Envigado, Envigado, Colombia
datacite.contributor: 3rd Latin American Workshop on Computational Neuroscience, LAWCN 2021
datacite.rights: http://purl.org/coar/access_right/c_abf2
oaire.resourcetype: http://purl.org/coar/resource_type/c_c94f
oaire.version: http://purl.org/coar/version/c_ab4af688f83e57aa
dc.contributor.contactperson: J.A.R. Salas
dc.contributor.contactperson: University Institution of Envigado, Envigado, Colombia
dc.contributor.contactperson: email: jarsalas@inf.ufrgs.br
dc.identifier.doi: 10.1007/978-3-031-08443-0_5
dc.identifier.instname: Universidad Mariana
dc.identifier.reponame: Repositorio Clara de Asis
dc.identifier.url: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85135092156&doi=10.1007%2f978-3-031-08443-0_5&partnerID=40&md5=51301dfe62fca90318243af8320f1a3a
dc.relation.citationendpage: 77
dc.relation.citationstartpage: 64
dc.relation.citationvolume: 1519 CCIS
dc.relation.conferencedate: 8 December 2021 through 10 December 2021
dc.relation.conferenceplace: Virtual, Online
dc.relation.iscitedby: 0
dc.relation.references: Robot Manipulators and Control Systems, pp. 35-107, (2007)
dc.relation.references: Ahmed S., Popov V., Topalov A., Shakev N., Hand gesture based concept of human-mobile robot interaction with Leap Motion sensor, IFAC-PapersOnLine, (2019)
dc.relation.references: Bassily D., Georgoulas C., Guettler J., Linner T., Bock T., Intuitive and adaptive robotic arm manipulation using the Leap Motion controller, ISR/Robotik 2014: 41st International Symposium on Robotics, pp. 1-7, (2014)
dc.relation.references: Ben-Ari M., Mondada F., Elements of Robotics, (2018)
dc.relation.references: Chen C., Chen L., Zhou X., Yan W., Controlling a robot using Leap Motion, 2017 2nd International Conference on Robotics and Automation Engineering (ICRAE), pp. 48-51, (2017)
dc.relation.references: Chen S., Ma H., Yang C., Fu M., Hand gesture based robot control system using Leap Motion, 9244, pp. 581-591
dc.relation.references: Choi H., et al., On the use of simulation in robotics: opportunities, challenges, and suggestions for moving forward, Proceedings of the National Academy of Sciences, (2021)
dc.relation.references: Fujii K., Gras G., Salerno A., Yang G.Z., Gaze gesture based human robot interaction for laparoscopic surgery, Med. Image Anal., https://doi.org/10.1016/j.media.2017.11.011
dc.relation.references: Galvan-Ruiz J., Travieso-Gonzalez C.M., Tejera-Fettmilch A., Pinan-Roescher A., Esteban-Hernandez L., Dominguez-Quintana L., Perspective and evolution of gesture recognition for sign language: a review, Sensors
dc.relation.references: Haseeb M.A., Kyrarini M., Jiang S., Ristic-Durrant D., Graser A., Head gesture-based control for assistive robots, Proceedings of the 11th Pervasive Technologies Related to Assistive Environments Conference, pp. 379-383
dc.relation.references: Marin G., Dominio F., Zanuttigh P., Hand gesture recognition with Leap Motion and Kinect devices, 2014 IEEE International Conference on Image Processing (ICIP), pp. 1565-1569, (2014)
dc.relation.references: Neto P., Simao M., Mendes N., Safeea M., Gesture-based human-robot interaction for human assistance in manufacturing, Int. J. Adv. Manuf. Technol., 101, 1, pp. 119-135, (2019)
dc.relation.references: Perumal S.K.J., Ganesan S., Physical interaction and control of robotic systems using hardware-in-the-loop simulation, Becoming Human with Humanoid, chap. 6, IntechOpen, (2020)
dc.relation.references: Pititeeraphab Y., Choitkunnan P., Thongpance N., Kullathum K., Pintavirooj C., Robot-arm control system using Leap Motion controller, 2016 International Conference on Biomedical Engineering (BME-HUST), pp. 109-112, (2016)
dc.relation.references: Universal Robots, Universal Robot UR3e
dc.relation.references: Ultraleap, Leap Motion Controller
dc.relation.references: Vysocky A., et al., Analysis of precision and stability of hand tracking with Leap Motion sensor, Sensors, 20, 15, (2020)
dc.relation.references: Weichert F., Bachmann D., Rudak B., Fisseler D., Analysis of the accuracy and robustness of the Leap Motion controller, Sensors, 13, 5, pp. 6380-6393, (2013)
dc.rights.accessrights: info:eu-repo/semantics/openAccess
dc.subject.keywords: Human-computer interaction
dc.subject.keywords: Leap-motion
dc.subject.keywords: Machine learning
dc.subject.keywords: Robotic arm
dc.subject.keywords: UR3
dc.subject.keywords: Human computer interaction
dc.subject.keywords: Human robot interaction
dc.subject.keywords: Manipulators
dc.subject.keywords: Surveys
dc.subject.keywords: Angle joints
dc.subject.keywords: Comparative studies
dc.subject.keywords: Interaction modes
dc.subject.keywords: Intuitive interfaces
dc.subject.keywords: Open modes
dc.subject.keywords: Robotic manipulators
dc.subject.keywords: Robotic systems
dc.subject.keywords: Robotic arms
dc.type.driver: info:eu-repo/semantics/conferenceObject
dc.type.hasversion: info:eu-repo/semantics/acceptedVersion
dc.type.redcol: http://purl.org/redcol/resource_type/ARTDATA
dc.type.spa: Contribución a congreso / Conferencia


Files in this item

No files associated with this item.

This item appears in the following collection(s)
