Security, safety, and reliability are essential attributes of mission- and life-critical systems. Security aims to protect systems from attack; safety aims to ensure that systems do not reach undesired states that lead to unacceptable consequences; and reliability is the probability of failure-free system operation over a given period of time in a specified environment. As system complexity continues to grow, ensuring security, safety, and reliability is imperative. With the increasing use of artificial intelligence (AI) and machine learning (ML) techniques, it is especially important to ensure and assess these three attributes when such techniques are built into mission- and life-critical systems.
This Computer special issue solicits original work exploring how to certify and regulate AI- and ML-based techniques used in mission- and life-critical systems. Submitted papers should address this theme with state-of-the-art technology and present consolidated, thoroughly evaluated, application-oriented research results from academic and industry perspectives.
Topics of interest include, but are not limited to, the following, with a focus on certification and regulation:
All submissions should follow the template provided by Computer and should consist of the following:
Manuscripts must be submitted online via ScholarOne. Select the special-issue option in Step 1 of the submission process to ensure that your article is reviewed for this special issue.
Contact the guest editors at co9-24@computer.org.