TY - CHAP
T1 - An Integrated Framework to Evaluate Information Systems Performance in High-Risk Settings
T2 - Experiences from the iTRACK Project
AU - Abdelgawad, Ahmed A.
AU - Comes, Tina
PY - 2023
Y1 - 2023
N2 - Evaluation and testing are essential steps in developing any information system. Even more attention must be devoted to these steps if the system is to be used in high-risk contexts, such as the response to conflict disasters. Several testing methodologies are designed to guarantee that software fulfills technology requirements; others assure usability and usefulness. However, there is currently no integrated evaluation framework with agreed standards that brings together the three elements: technology requirements, usability, and usefulness. This gap constitutes a barrier to innovation and poses risks to responders and affected populations if the technology is introduced without proper testing. This chapter aims to close this gap. Based on a review of evaluation methods and measurement metrics for information systems, we designed an integrated evaluation framework including standard metrics for code quality testing, usability methods, subjective usefulness questionnaires, and key performance indicators. We developed and implemented a reporting and evaluation system that demonstrates our evaluation framework within the context of the EU H2020 project iTRACK, which developed an integrated system for the safety and security of humanitarian missions. We demonstrate how our approach allows us to measure the quality and usefulness of the iTRACK integrated system.
KW - Evaluation framework
KW - High risk
KW - Humanitarian disaster
KW - Requirements engineering
KW - Software quality testing
KW - Usability
KW - Usefulness
UR - http://www.scopus.com/inward/record.url?scp=85152082307&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-20939-0_9
DO - 10.1007/978-3-031-20939-0_9
M3 - Chapter
AN - SCOPUS:85152082307
T3 - Public Administration and Information Technology
SP - 147
EP - 180
BT - Public Administration and Information Technology
PB - Springer
ER -