- Submission Deadline: 26 May 2023
- Review Decision: 18 September 2023
- Issue Publication: January 2024
There is a near consensus among academia, industry, and government that the scale and complexity of modern software influence how decisions are made within software systems and how those systems interact with humans. Many technologies have been developed to address these challenges. In this special issue, we aim to explore opportunities to bring observability and explainability together to improve the transparency of complex, large-scale software systems. Observability supports timely, continuous, and configurable reconstruction of the internal states of software systems at varying degrees of granularity across a system. Explainability mostly focuses on post hoc approximation of a system’s behavior, relating inputs to chosen outputs. Observability and explainability each offer a distinct perspective on transparency. Their synergy not only supports data-driven solutions but also paves a path toward broader goals in software systems, including but not limited to tracing causality, accountable decision making and auditing, automation and self-organization, and collaborative intelligence among humans, software systems, and large-scale machine learning.
We invite article submissions covering all aspects of observability and explainability for decision making in software systems, including but not limited to:
- System and software requirements in relation to observability and explainability
- Software architecture and design techniques in relation to observability and explainability
- Software maintenance and evolution in relation to observability and explainability
- Discovery and classification of uncertainty factors in relation to observability and explainability
- Tools for observability of software systems
- Tracing and logging of software systems
- Data engineering for observability and explainability
- Process, framework, and service of observability and explainability
- Models, algorithms, applications, and user interfaces for observability and explainability
- Human-in-the-loop pipelines and workflows of explainability
- Integrating observability or explainability into existing systems
- Governance aspects of observability or explainability
- Observability or explainability culture and education
For author information and guidelines on submission criteria, please visit the Software’s Author Information page. Please submit papers through the ScholarOne system, and be sure to select the special issue name. Manuscripts must not have been published previously and must not be under consideration for publication elsewhere. Please submit only full papers intended for review, not abstracts, to the ScholarOne portal.
Please contact the guest editors at email@example.com.
- Yan Liu, Concordia University, Canada
- Wahab Hamou-Lhadj, Concordia University, Canada
- Jiye Li, Thales Group, Canada
- Qinghua Lu, Data61, Australia