ScienceGuardians

Generative AI and human–robot interaction: implications and future agenda for business, society and ethics

Authors: Bojan Obrenovic, Xiao Gu, Guoyu Wang, Danijela Godinic, Ilimdorjon Jakhongirov
Journal: AI & Society
Publisher: Springer Science and Business Media LLC
Publish date: 2024-03-15
ISSN: 0951-5666
DOI: 10.1007/s00146-024-01889-0
View on Publisher's Website

I’m looking at Tables 2 and 3, and they’re essentially empty – just institution names and author names, with almost no actual figures. Table 1 reports relevance scores, but there is no methodology section explaining how those scores were calculated. How can this be considered a valid scientometric analysis when the core data that is supposed to support your conclusions about research trends is missing? Without that data, isn’t this speculation dressed up as science?

You claim generative AI like ChatGPT “blurs the boundaries between humans and robots,” but earlier you define HRI as involving physical robots interacting with humans. ChatGPT is text-based software running on servers – there’s no physical embodiment, no sensors, no actuators. How can you justify analyzing a purely digital text generator using frameworks designed for physical robots without acknowledging this as a fundamental category error? Doesn’t this stretch the concept of “interaction” so thin it loses all meaning?

Your paper promises to explain “technical aspects of generative AI that enhance its effectiveness… compared to traditional rule-based systems” and how it “can be optimized for specific HRI applications.” But I can’t find any experimental results, benchmark comparisons, or optimization frameworks anywhere. You list features like “conversational communication” and “adaptive learning” as advantages, but without empirical evidence showing these actually lead to better outcomes in real HRI contexts, aren’t you just repeating marketing materials from OpenAI?
