This preprint has been published in a journal as an article.
DOI of the published article: https://doi.org/10.1109/ACCESS.2018.2853560
Preprint / Version 1

Common Metrics to Benchmark Human-Machine Teams (HMT): A Review

Authors

  • Praveen Damacharla, KINETICAI INC
  • Ahmad Javaid
  • Jennie J. Gallimore
  • Vijay K. Devabhaktuni

DOI:

https://doi.org/10.31224/2791

Keywords:

Human Factors, Artificial Intelligence (AI), Robotics, human-computer interaction, Human Robot Interaction, Software metrics, Human-Machine Teaming, Performance metrics, Metrics, Benchmarking

Abstract

A significant amount of work is invested in human-machine teaming (HMT) across multiple fields. Accurately and effectively measuring the system performance of an HMT is crucial for moving the design of these systems forward. Metrics are the tools that enable benchmarking of any system and serve as an evaluation platform for assessing a system's performance as well as for its verification and validation. Currently, there is no agreed-upon set of benchmark metrics for developing HMT systems. Therefore, identification and classification of common metrics are imperative to create a benchmark in the HMT field. The key focus of this review is a detailed survey aimed at identifying the metrics employed in different segments of HMT and determining the common metrics that can be used in the future to benchmark HMTs. We organize this review as follows: identification of the metrics used in HMTs to date, followed by their classification based on functionality and measurement technique. We also analyze all the identified metrics in detail, classifying them as theoretical, applied, real-time, non-real-time, measurable, or observable. We conclude this review with a detailed analysis of the identified common metrics and their usage to benchmark HMTs.


Posted

2023-01-20