Feel the Point Clouds: Traversability Prediction and Tactile Terrain
Detection Information for an Improved Human-Robot Interaction
R. Edlinger, and A. Nüchter. Proceedings of the 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), page 1121--1128. Busan, Korea, (August 2023)
DOI: 10.1109/RO-MAN57019.2023.10309349
Abstract
The field of human-robot interaction has been rapidly
advancing in recent years, as robots are
increasingly being integrated into various aspects
of human life. However, for robots to effectively
collaborate with humans, it is crucial that they
have a deep understanding of the environment in
which they operate. In particular, the ability to
predict traversability and detect tactile
information is crucial for enhancing the safety and
efficiency of human-robot interactions. To address
this challenge, this paper proposes a method called
"Feel the Point Clouds" that uses point clouds to
predict traversability and detect tactile terrain
information for a tracked rescue robot. This
information can be used to adjust the robot’s
behavior and movements in real time, allowing it to
interact with the environment in a more intuitive
and safe manner. The experimental results of the
proposed method are evaluated in various scenarios
and demonstrate its effectiveness in improving
human-robot interaction and visualization for a more
accurate and intuitive understanding of the
environment.
%0 Conference Paper
%1 ROMAN2023
%A Edlinger, R.
%A Nüchter, A.
%B Proceedings of the 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)
%C Busan, Korea
%D 2023
%K imported
%P 1121--1128
%R 10.1109/RO-MAN57019.2023.10309349
%T Feel the Point Clouds: Traversability Prediction and Tactile Terrain
Detection Information for an Improved Human-Robot Interaction
%U https://robotik.informatik.uni-wuerzburg.de/telematics/download/roman2023.pdf
%X The field of human-robot interaction has been rapidly
advancing in recent years, as robots are
increasingly being integrated into various aspects
of human life. However, for robots to effectively
collaborate with humans, it is crucial that they
have a deep understanding of the environment in
which they operate. In particular, the ability to
predict traversability and detect tactile
information is crucial for enhancing the safety and
efficiency of human-robot interactions. To address
this challenge, this paper proposes a method called
"Feel the Point Clouds" that uses point clouds to
predict traversability and detect tactile terrain
information for a tracked rescue robot. This
information can be used to adjust the robot’s
behavior and movements in real time, allowing it to
interact with the environment in a more intuitive
and safe manner. The experimental results of the
proposed method are evaluated in various scenarios
and demonstrate its effectiveness in improving
human-robot interaction and visualization for a more
accurate and intuitive understanding of the
environment.
@inproceedings{ROMAN2023,
abstract = {The field of human-robot interaction has been rapidly
advancing in recent years, as robots are
increasingly being integrated into various aspects
of human life. However, for robots to effectively
collaborate with humans, it is crucial that they
have a deep understanding of the environment in
which they operate. In particular, the ability to
predict traversability and detect tactile
information is crucial for enhancing the safety and
efficiency of human-robot interactions. To address
this challenge, this paper proposes a method called
``Feel the Point Clouds'' that uses point clouds to
predict traversability and detect tactile terrain
information for a tracked rescue robot. This
information can be used to adjust the robot’s
behavior and movements in real time, allowing it to
interact with the environment in a more intuitive
and safe manner. The experimental results of the
proposed method are evaluated in various scenarios
and demonstrate its effectiveness in improving
human-robot interaction and visualization for a more
accurate and intuitive understanding of the
environment.},
added-at = {2023-12-03T22:21:07.000+0100},
address = {Busan, Korea},
author = {Edlinger, R. and N{\"u}chter, A.},
biburl = {https://www.bibsonomy.org/bibtex/249d33570bec8667af6b6eea101812bd9/nuechter76},
booktitle = {Proceedings of the 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)},
doi = {10.1109/RO-MAN57019.2023.10309349},
interhash = {32b2444ce2952662a826c47282392ed6},
intrahash = {49d33570bec8667af6b6eea101812bd9},
keywords = {imported},
month = {August},
pages = {1121--1128},
timestamp = {2023-12-03T22:21:12.000+0100},
title = {Feel the Point Clouds: Traversability Prediction and Tactile Terrain
Detection Information for an Improved Human-Robot Interaction},
url = {https://robotik.informatik.uni-wuerzburg.de/telematics/download/roman2023.pdf},
year = 2023
}