Call for Papers: Explainability in Automated Software Engineering (Ex-ASE)

The increasing automation of software engineering processes—from requirements and design to testing and deployment—has significantly improved efficiency and scalability. However, the growing reliance on AI-based and autonomous techniques often leads to a loss of transparency, resulting in “black-box” systems whose behavior and reasoning are opaque to developers and users. This lack of explainability poses serious risks in high-stakes domains such as autonomous vehicles, digital health, and robotics, where software decisions have direct real-world consequences.

To address these challenges, this Special Issue invites submissions on the emerging field of Explainability in Automated Software Engineering (Ex-ASE). The aim is to promote explainability as a core principle in the development of ethical, transparent, and responsible automated software systems. We welcome contributions presenting novel methods, tools, frameworks, and empirical studies that make explainability a first-class concern in both traditional and autonomous software engineering.

Topics of interest include (but are not limited to):

  • Foundations of Explainability

    • Explainability in requirements engineering, traceability, and software design
    • Theoretical frameworks and models for explainable ASE
    • Legal, ethical, and societal aspects of transparency and accountability
  • Explainable Methods and Techniques

    • Explanations for automated testing, static/dynamic analysis, and program repair
    • Human-in-the-loop explainability for automated development workflows
    • Context-aware and stakeholder-specific explanation generation
    • Explainability in SE4AI and AI4SE approaches
  • Tools and Applications

    • Tool support for explaining model-driven or AI-based automation
    • Explainability in software-dependent domains such as CPS, IoT, robotics, and digital health
    • Industrial case studies and real-world applications
  • Evaluation and Impact

    • Evaluation methods and metrics for explanation quality and impact

Authors should prepare their manuscripts according to the Instructions for Authors available in the Journal’s submission guidelines: https://link.springer.com/journal/10515/submission-guidelines. Submitted papers should present original, unpublished work relevant to one of the topics of the special issue. All submissions will be evaluated by at least two independent reviewers on the basis of relevance, significance of contribution, technical quality, scholarship, and quality of presentation. It is the policy of the journal that no submission, or substantially overlapping submission, may be published or under review at another journal or conference at any time during the review process.

Please note that the authors of selected papers presented at Ex-ASE 2025 are invited to submit an extended version of their contributions, taking into consideration both the reviewers’ comments on the conference paper and the feedback received during its presentation at the conference. The extended version is expected to contain a substantial new scientific contribution, e.g., in the form of new algorithms, experiments, or qualitative/quantitative comparisons; neither verbatim transfer of large parts of the conference paper nor reproduction of already published figures is permitted.

The extended versions of Ex-ASE 2025 papers will undergo the standard, rigorous journal review process and be accepted only if well-suited to the topic of this special issue and meeting the scientific level of the journal. Final decisions on all papers are made by the Editor in Chief.

Deadline

Submission deadline: 30 June 2026

How to Submit

Opening date for submissions: 15 January 2026

Please submit via the Journal’s online submission system on the Springer website.

Editors

  • Mersedeh Sadeghi, University of Cologne, Germany
  • Livia Lestingi, Politecnico di Milano, Italy
  • Alireza Javadian Sabet, University of Pittsburgh, USA
  • Marjan Hosseini, University of Connecticut, USA