Show simple item record
Controlling the UR3 Robotic Arm Using a Leap Motion: A Comparative Study
dc.rights.license | http://creativecommons.org/licenses/by-nc-nd/4.0/ | |
dc.contributor.author | Lopez D.A. | |
dc.contributor.author | Lopez M.A. | |
dc.contributor.author | Muñoz D.S. | |
dc.contributor.author | Santa J.A. | |
dc.contributor.author | Gomez D.F. | |
dc.contributor.author | Barone D. | |
dc.contributor.author | Torresen J. | |
dc.contributor.author | Salas J.A.R. | |
dc.contributor.editor | Ribeiro P.R. | |
dc.contributor.editor | Cota V.R. | |
dc.contributor.editor | Barone D.A. | |
dc.contributor.editor | de Oliveira A.C. | |
dc.contributor.other | 3rd Latin American Workshop on Computational Neuroscience, LAWCN 2021 | |
dc.date.accessioned | 2024-12-02T20:16:05Z | |
dc.date.available | 2024-12-02T20:16:05Z | |
dc.date.issued | 2022 | |
dc.identifier.isbn | 978-3-031-08442-3 | |
dc.identifier.issn | 1865-0929 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14112/29018 | |
dc.description.abstract | Each day, robotic systems are becoming more familiar and common in contexts such as factories, hospitals, homes, and restaurants, creating a need for affordable and intuitive interfaces that support effective and engaging communication with humans. Likewise, innovative devices that offer alternative ways of interacting with machines allow us to create new interfaces, improving learning, training, and motion applications. Thus, this paper compares two interaction modes that use a Leap Motion to control a robotic manipulator (UR3) simulator. Users can control the robot through numerical gestures that set the joint angles (coded mode) or through clockwise/counterclockwise gestures that increase or decrease the angle values (open mode). We evaluate these modes objectively, capturing from 30 subjects the number of gestures and the time employed to reach three specific poses. We also collected subjective questionnaires to compare the control methods and preferences. Our findings suggest that both methods require a similar number of gestures, but coded control takes less time, with higher variation across ages. Moreover, subjects’ preferences indicate a slight inclination towards the open mode. Finally, it remains necessary to explore tasks of varying difficulty and to increase the population in order to gain a more general understanding of preferences and performance. © 2022, Springer Nature Switzerland AG. | |
dc.format | 13 | |
dc.format.medium | Electronic resource | |
dc.format.mimetype | application/pdf | |
dc.language.iso | eng | |
dc.publisher | Springer Science and Business Media Deutschland GmbH | |
dc.rights.uri | Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) | |
dc.source | Communications in Computer and Information Science | |
dc.source | Commun. Comput. Info. Sci. | |
dc.source | Scopus | |
dc.title | Controlling the UR3 Robotic Arm Using a Leap Motion: A Comparative Study | |
datacite.contributor | Mariana University, Pasto, Colombia | |
datacite.contributor | University Institution of Envigado, Envigado, Colombia | |
datacite.contributor | Federal University Rio Grande do Sul, Porto Alegre, Brazil | |
datacite.contributor | University of Oslo, Oslo, Norway | |
datacite.contributor | Lopez D.A., Mariana University, Pasto, Colombia | |
datacite.contributor | Lopez M.A., Mariana University, Pasto, Colombia | |
datacite.contributor | Muñoz D.S., Mariana University, Pasto, Colombia | |
datacite.contributor | Santa J.A., Mariana University, Pasto, Colombia | |
datacite.contributor | Gomez D.F., Mariana University, Pasto, Colombia | |
datacite.contributor | Barone D., Federal University Rio Grande do Sul, Porto Alegre, Brazil | |
datacite.contributor | Torresen J., University of Oslo, Oslo, Norway | |
datacite.contributor | Salas J.A.R., University Institution of Envigado, Envigado, Colombia | |
datacite.contributor | 3rd Latin American Workshop on Computational Neuroscience, LAWCN 2021 | |
datacite.rights | http://purl.org/coar/access_right/c_abf2 | |
oaire.resourcetype | http://purl.org/coar/resource_type/c_c94f | |
oaire.version | http://purl.org/coar/version/c_ab4af688f83e57aa | |
dc.contributor.contactperson | J.A.R. Salas | |
dc.contributor.contactperson | University Institution of Envigado, Envigado, Colombia | |
dc.contributor.contactperson | email: jarsalas@inf.ufrgs.br | |
dc.identifier.doi | 10.1007/978-3-031-08443-0_5 | |
dc.identifier.instname | Universidad Mariana | |
dc.identifier.reponame | Repositorio Clara de Asis | |
dc.identifier.url | https://www.scopus.com/inward/record.uri?eid=2-s2.0-85135092156&doi=10.1007%2f978-3-031-08443-0_5&partnerID=40&md5=51301dfe62fca90318243af8320f1a3a | |
dc.relation.citationendpage | 77 | |
dc.relation.citationstartpage | 64 | |
dc.relation.citationvolume | 1519 CCIS | |
dc.relation.conferencedate | 8 December 2021 through 10 December 2021 | |
dc.relation.conferenceplace | Virtual, Online | |
dc.relation.iscitedby | 0 | |
dc.relation.references | Robot Manipulators and Control Systems, pp. 35-107, (2007) | |
dc.relation.references | Ahmed S., Popov V., Topalov A., Shakev N., Hand Gesture Based Concept of Human-Mobile Robot Interaction with Leap Motion Sensor, IFAC-PapersOnLine, (2019) | |
dc.relation.references | Bassily D., Georgoulas C., Guettler J., Linner T., Bock T., Intuitive and adaptive robotic arm manipulation using the leap motion controller, ISR/Robotik 2014: 41st International Symposium on Robotics, pp. 1-7, (2014) | |
dc.relation.references | Ben-Ari M., Mondada F., Elements of Robotics, (2018) | |
dc.relation.references | Chen C., Chen L., Zhou X., Yan W., Controlling a robot using leap motion, 2017 2nd International Conference on Robotics and Automation Engineering (ICRAE), pp. 48-51, (2017) | |
dc.relation.references | Chen S., Ma H., Yang C., Fu M., Hand Gesture Based Robot Control System Using Leap Motion, 9244, pp. 581-591 | |
dc.relation.references | Choi H., et al., On the Use of Simulation in Robotics: Opportunities, Challenges, and Suggestions for Moving Forward, Proceedings of the National Academy of Sciences, (2021) | |
dc.relation.references | Fujii K., Gras G., Salerno A., Yang G.Z., Gaze Gesture Based Human Robot Interaction for Laparoscopic Surgery, Med. Image Anal., https://doi.org/10.1016/j.media.2017.11.011 | |
dc.relation.references | Galvan-Ruiz J., Travieso-Gonzalez C.M., Tejera-Fettmilch A., Pinan-Roescher A., Esteban-Hernandez L., Dominguez-Quintana L., Perspective and Evolution of Gesture Recognition for Sign Language: A Review. Sensors | |
dc.relation.references | Haseeb M.A., Kyrarini M., Jiang S., Ristic-Durrant D., Graser A., Head Gesture-Based Control for Assistive Robots, Proceedings of the 11th Pervasive Technologies Related to Assistive Environments Conference, pp. 379-383 | |
dc.relation.references | Marin G., Dominio F., Zanuttigh P., Hand gesture recognition with leap motion and kinect devices, 2014 IEEE International Conference on Image Processing (ICIP), pp. 1565-1569, (2014) | |
dc.relation.references | Neto P., Simao M., Mendes N., Safeea M., Gesture-based human-robot interaction for human assistance in manufacturing, Int. J. Adv. Manuf. Technol., 101, 1, pp. 119-135, (2019) | |
dc.relation.references | Perumal S.K.J., Ganesan S., Physical interaction and control of robotic systems using hardware-in-the-loop simulation, Becoming Human with Humanoid, Chap. 6, IntechOpen, (2020) | |
dc.relation.references | Pititeeraphab Y., Choitkunnan P., Thongpance N., Kullathum K., Pintavirooj C., Robot-arm control system using leap motion controller, 2016 International Conference on Biomedical Engineering (BME-HUST), pp. 109-112, (2016) | |
dc.relation.references | Universal Robots, Universal Robot UR3e | |
dc.relation.references | Ultraleap: Leap Motion Controller | |
dc.relation.references | Vysocky A., et al., Analysis of Precision and Stability of Hand Tracking with Leap Motion Sensor, Sensors, 20, 15, (2020) | |
dc.relation.references | Weichert F., Bachmann D., Rudak B., Fisseler D., Analysis of the accuracy and robustness of the leap motion controller, Sensors, 13, 5, pp. 6380-6393, (2013) | |
dc.rights.accessrights | info:eu-repo/semantics/openAccess | |
dc.subject.keywords | Human-computer interaction | |
dc.subject.keywords | Leap-motion | |
dc.subject.keywords | Machine learning | |
dc.subject.keywords | Robotic arm | |
dc.subject.keywords | UR3 | |
dc.subject.keywords | Human computer interaction | |
dc.subject.keywords | Human robot interaction | |
dc.subject.keywords | Manipulators | |
dc.subject.keywords | Surveys | |
dc.subject.keywords | Angle joints | |
dc.subject.keywords | Comparatives studies | |
dc.subject.keywords | Interaction modes | |
dc.subject.keywords | Intuitive interfaces | |
dc.subject.keywords | Machine-learning | |
dc.subject.keywords | Open modes | |
dc.subject.keywords | Robotic manipulators | |
dc.subject.keywords | Robotic systems | |
dc.subject.keywords | Robotic arms | |
dc.type.driver | info:eu-repo/semantics/conferenceObject | |
dc.type.hasversion | info:eu-repo/semantics/acceptedVersion | |
dc.type.redcol | http://purl.org/redcol/resource_type/ARTDATA | |
dc.type.spa | Contribución a congreso / Conferencia |
Files in this item
Files | Size | Format | View |
---|---|---|---|
There are no files associated with this item. |
This item appears in the following collection(s)
-
Artículos Scopus [165]