State-of-Art in Sensors for Robotic Applications

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: closed (30 June 2022) | Viewed by 8275

Special Issue Editors


Prof. Dr. Salvatore Pirozzi
Guest Editor
Dipartimento di Ingegneria, Università degli Studi della Campania “Luigi Vanvitelli”, Via Roma, 29, 81031 Aversa, CE, Italy
Interests: sensors for robotics applications; control; human–robot interaction

Dr. Gianluca Palli
Guest Editor
Department of Electrical, Electronic and Information Engineering, Università di Bologna—DEI, Viale del Risorgimento 2, 40136 Bologna, Italy
Interests: robotic manipulation; robotic hands; design and control of robotic manipulators; underwater robotics; force/tactile sensors; compliant actuation; mobile manipulation; manipulation of deformable objects

Special Issue Information

Dear Colleagues,

In many robotic applications, the successful execution of a task strongly depends on knowledge of the environment's features. For example, in a grasping and manipulation task, the geometrical and physical characteristics of the objects provide fundamental information for possible automation. For the implementation of a human–robot collaborative task, interaction is possible only through suitable sensing systems. To address these issues, and in many other cases, robotic systems are frequently equipped with a large number of sensing devices, both standard and innovative, whose measurements are used as inputs to both model-based control systems and machine learning systems. The latter are increasingly developed in order to raise the autonomy level of robots. This Special Issue will present how sensing systems integrated into robotic systems can be used with recent sensor fusion techniques to enable the automation of new, challenging tasks.

This Special Issue invites contributions on (but not limited to) the following topics:

  • Sensor technologies for robotic applications
  • Sensor modeling for robotic applications
  • Data interpretation for robotic applications
  • Grasping and manipulation
  • Dexterous manipulation
  • Object feature recognition for robotic applications
  • Physical human–robot interaction
  • Human–machine interfaces

You may choose our Joint Special Issue in Machines.

Prof. Dr. Salvatore Pirozzi
Dr. Gianluca Palli
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Sensor technologies for robotic applications
  • Sensor modeling for robotic applications
  • Data interpretation for robotic applications
  • Grasping and manipulation
  • Dexterous manipulation
  • Object feature recognition for robotic applications
  • Physical human–robot interaction
  • Human–machine interfaces


Published Papers (3 papers)


Research

31 pages, 8854 KiB  
Article
Efficient Obstacle Detection and Tracking Using RGB-D Sensor Data in Dynamic Environments for Robotic Applications
by Arindam Saha, Bibhas Chandra Dhara, Saiyed Umer, Kulakov Yurii, Jazem Mutared Alanazi and Ahmad Ali AlZubi
Sensors 2022, 22(17), 6537; https://doi.org/10.3390/s22176537 - 30 Aug 2022
Cited by 8 | Viewed by 3368
Abstract
Obstacle detection is an essential task for autonomous navigation by robots, and the task becomes more complex in a dynamic and cluttered environment. In this context, the RGB-D camera sensor is one of the most common devices providing a quick and reasonable estimation of the environment in the form of RGB and depth images. This work proposes an efficient obstacle detection and tracking method that uses depth images to facilitate quick dynamic obstacle detection. To achieve early detection of dynamic obstacles and stable estimation of their states, the method applies a u-depth map for obstacle detection, as in previous work. Unlike existing methods, the present method provides dynamic thresholding on the u-depth map to detect obstacles more accurately. We further propose a restricted v-depth map technique, applied as post-processing after the u-depth map processing, to obtain a better prediction of the obstacle dimensions. We also propose a new algorithm to track obstacles for as long as they remain within the field of view (FOV). We evaluate the performance of the proposed system on different kinds of data sets; the proposed method outperformed vision-based state-of-the-art (SoA) methods in terms of state estimation of dynamic obstacles and execution time.
(This article belongs to the Special Issue State-of-Art in Sensors for Robotic Applications)
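As a rough illustration of the u-depth representation this abstract refers to, the sketch below builds a per-column depth histogram from a depth image and flags cells whose count exceeds a fixed fraction of the column height. The function names, bin count, range limit, and the simple fixed-fraction threshold are illustrative assumptions, not the paper's dynamic-thresholding scheme.

```python
import numpy as np

def u_depth_map(depth, n_bins=60, d_max=6.0):
    """Build a u-depth map: for each image column, a histogram of depths.

    depth  : (H, W) array of depths in metres (NaN, <=0, or >= d_max = invalid).
    Returns: (n_bins, W) array of per-column depth histograms.
    """
    h, w = depth.shape
    u = np.zeros((n_bins, w), dtype=np.int32)
    valid = np.isfinite(depth) & (depth > 0) & (depth < d_max)
    # Quantise each valid depth into a histogram bin index.
    bins = (depth[valid] / d_max * n_bins).astype(int)
    cols = np.nonzero(valid)[1]
    np.add.at(u, (bins, cols), 1)        # unbuffered scatter-add per (bin, col)
    return u

def detect_obstacles(u, depth_shape, k=0.5):
    """Flag (bin, column) cells as obstacle candidates when the count
    exceeds k times the column height, i.e. an object at roughly one
    depth spans a large vertical extent of that column."""
    h, _ = depth_shape
    return u > k * h
```

On a synthetic depth image, a box at a single depth shows up as a high-count band in the u-depth map at the corresponding bin, while empty columns stay below threshold.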

25 pages, 8987 KiB  
Article
Collision Detection of a HEXA Parallel Robot Based on Dynamic Model and a Multi-Dual Depth Camera System
by Xuan-Bach Hoang, Phu-Cuong Pham and Yong-Lin Kuo
Sensors 2022, 22(15), 5923; https://doi.org/10.3390/s22155923 - 08 Aug 2022
Cited by 7 | Viewed by 1837
Abstract
This paper introduces a Hexa parallel robot and an obstacle collision detection method based on dynamic modeling and a computer vision system. The process of handling collisions comprises collision detection, collision isolation, and collision identification, each applied to the Hexa robot. Initially, the configuration and the kinematic and dynamic characteristics of the Hexa parallel robot along its movement trajectories are analyzed to extract the knowledge required by the method. Next, a virtual force sensor is presented that estimates the collision detection signal by combining the inverse dynamics solution with a low-pass filter. Then, a vision system consisting of dual depth cameras is designed for obstacle isolation and for determining the contact point location at the end-effector, an arm, or a rod of the Hexa robot. Finally, a recursive Newton–Euler algorithm is applied to compute the contact forces caused by collisions with the real Hexa robot. Based on the experimental results, the identified forces are compared with sensor forces to evaluate the performance of the proposed collision detection method.
(This article belongs to the Special Issue State-of-Art in Sensors for Robotic Applications)
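The virtual-force-sensor idea named in this abstract — comparing measured joint torque against an inverse-dynamics prediction and smoothing the residual with a low-pass filter — can be sketched generically as below. The class name, the first-order filter form, and the threshold value are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

class VirtualForceSensor:
    """Hedged sketch of a model-based collision detection signal: the
    external joint torque is estimated as the gap between the measured
    torque and the inverse-dynamics prediction, then smoothed with a
    first-order discrete low-pass filter."""

    def __init__(self, inverse_dynamics, cutoff_hz=5.0, dt=0.001):
        self.idyn = inverse_dynamics          # idyn(q, dq, ddq) -> model torque
        alpha = 2.0 * np.pi * cutoff_hz * dt
        self.a = alpha / (1.0 + alpha)        # discrete low-pass gain in (0, 1)
        self.r = None                         # filtered residual state

    def update(self, tau_meas, q, dq, ddq):
        """Feed one control-cycle sample; returns the filtered residual."""
        residual = tau_meas - self.idyn(q, dq, ddq)
        self.r = residual if self.r is None else self.r + self.a * (residual - self.r)
        return self.r

    def collision(self, threshold=2.0):
        """A collision is declared when the residual norm exceeds a threshold."""
        return self.r is not None and np.linalg.norm(self.r) > threshold
```

A sudden torque step not explained by the dynamic model drives the residual up and trips the detector; once the disturbance ends, the filtered residual decays back below threshold.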

15 pages, 9375 KiB  
Article
Tell Me, What Do You See?—Interpretable Classification of Wiring Harness Branches with Deep Neural Networks
by Piotr Kicki, Michał Bednarek, Paweł Lembicz, Grzegorz Mierzwiak, Amadeusz Szymko, Marek Kraft and Krzysztof Walas
Sensors 2021, 21(13), 4327; https://doi.org/10.3390/s21134327 - 24 Jun 2021
Cited by 7 | Viewed by 2086
Abstract
In the context of the robotisation of industrial operations related to manipulating deformable linear objects, there is a need for sophisticated machine vision systems that can classify wiring harness branches and provide information on where to put them in the assembly process. However, industrial applications require interpretability of the machine learning system's predictions, as the user wants to know the underlying reason for the decision made by the system. To address this issue, we propose several different neural network architectures that are tested on our novel dataset. We conducted various experiments to assess the influence of modality and data fusion type, and the impact of data augmentation and pretraining. The output of each network is evaluated in terms of performance and is also accompanied by saliency maps, which allow the user to gain in-depth insight into the classifier's operation, including a way of explaining the responses of the deep neural network and making system predictions interpretable by humans.
(This article belongs to the Special Issue State-of-Art in Sensors for Robotic Applications)
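The saliency maps mentioned in this abstract attribute a classifier's score to individual input pixels. As a minimal, framework-free sketch of the idea, the snippet below approximates the gradient of a scalar class score with respect to each pixel by finite differences; a real deep-network pipeline would use autodiff gradients instead, and the function names here are illustrative.

```python
import numpy as np

def saliency_map(predict, image, eps=1e-3):
    """Finite-difference saliency: magnitude of the class score's
    sensitivity to each input pixel.

    predict : maps an (H, W) image to a scalar class score.
    Returns : (H, W) array of per-pixel sensitivities.
    """
    sal = np.zeros_like(image)
    base = predict(image)
    it = np.nditer(image, flags=["multi_index"])
    for _ in it:
        idx = it.multi_index
        bumped = image.copy()
        bumped[idx] += eps               # perturb a single pixel
        sal[idx] = abs(predict(bumped) - base) / eps
    return sal
```

Pixels the score depends on strongly receive large saliency values, which is what lets a human inspect which image regions drove the classification.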
