Call for Papers: Special Issue on Large Language Models in Service-Oriented Ecosystems Design: Advances and Applications

IEEE Transactions on Services Computing seeks submissions for this upcoming special issue.

Submission Deadline: 31 October 2026

Publication Date: Mid 2027


In recent years, Large Language Models (LLMs) have emerged as a transformative technology, opening new frontiers and opportunities across many areas of ICT, including complex systems design and engineering.

One of the most promising applications lies in service-oriented ecosystems, where LLMs can enhance design processes, support service identification, streamline service discovery, optimize composition mechanisms, and, more generally, enable intelligent automation within Service-Oriented Architectures (SOAs). This represents a significant, yet still largely underexplored, opportunity for research and practice.

In this context, combining LLMs with the modular and interoperable principles of SOAs could open new research directions for designing and developing adaptive, context-aware, and self-optimizing service ecosystems. This integration may enable systems to dynamically identify, discover, and compose services, reconfigure based on user needs, and continuously improve through natural language interactions.
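
As a concrete illustration of the pattern sketched above, the following minimal example shows how a natural-language request might drive service discovery and composition over a toy service registry. The registry, the llm_select_services helper, and the keyword-matching stand-in for an actual LLM call are hypothetical placeholders for illustration only, not part of any existing framework.

# Minimal sketch of LLM-assisted service discovery and composition.
# The registry and the LLM stand-in below are hypothetical; in practice,
# llm_select_services would delegate the selection to an actual LLM.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Service:
    name: str
    description: str
    invoke: Callable[[dict], dict]

# Toy registry of services, each with a natural-language description
# that an LLM could reason over during discovery.
REGISTRY: List[Service] = [
    Service("geocode", "Resolve a street address into coordinates",
            lambda req: {**req, "coords": (45.54, 10.22)}),
    Service("weather", "Return the weather forecast for given coordinates",
            lambda req: {**req, "forecast": "sunny"}),
]

def llm_select_services(request: str, registry: List[Service]) -> List[Service]:
    """Placeholder for an LLM call that maps a natural-language request to an
    ordered list of services; naive keyword matching stands in for the model."""
    keywords = {"weather": ["weather", "forecast"], "geocode": ["address", "city"]}
    selected = [s for s in registry
                if any(k in request.lower() for k in keywords.get(s.name, []))]
    # Keep a sensible invocation order: geocoding before the forecast lookup.
    return sorted(selected, key=lambda s: s.name != "geocode")

def compose(request: str) -> dict:
    """Chain the selected services into a simple sequential composition."""
    state = {"request": request}
    for service in llm_select_services(request, REGISTRY):
        state = service.invoke(state)
    return state

if __name__ == "__main__":
    print(compose("What is the weather forecast at this address: Via Branze 38, Brescia?"))

In a realistic setting, the LLM would not only select services from their descriptions but could also generate the orchestration plan itself and refine it through further natural-language interaction, which is precisely the kind of contribution this Special Issue seeks.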

More broadly, the convergence of LLMs and SOAs represents a potential paradigm shift in how distributed systems are conceptualized, designed, and implemented. It could lead to more flexible, responsive, and intelligent software architectures that are better aligned with evolving business goals and user requirements.

This Special Issue aims to bring together cutting-edge research exploring the intersection of LLMs and service-oriented computing, showcasing both theoretical advancements and practical implementations that highlight how LLMs can improve efficiency, accuracy, and scalability in the design of service-oriented solutions.

In this respect, we welcome submissions that provide new insights into the application of LLMs in architectural decision-making, interoperability, and intelligent service identification, discovery and composition.

The call for this Special Issue is open to all researchers and practitioners, ensuring a broad and diverse range of contributions from academia and industry.

Topics of interest

We invite original research articles, review papers and case study papers on topics including, but not limited to:

  • LLMs for Improving Efficiency, Accuracy, and Scalability of Service-Oriented Architectures
  • LLM-Driven Approaches for Intelligent Service Identification, Discovery, Orchestration, and Composition
  • LLM-Assisted Design of Service-Based Software Architectural Solutions
  • LLM-Powered Service Interoperability and Integration Solutions
  • Case Studies, Applications, and Real-World Implementations of LLMs in Service-Oriented Computing
  • Assessing the Impact of LLMs on Reliability, Maintainability, and Performance in Service-Oriented Architectures

Important Dates

  • Manuscript Submission Deadline: October 31, 2026
  • First Round Notification: March 1, 2027
  • Revised Manuscript Due: May 1, 2027
  • Final Decision Notification: June 1, 2027
  • Final Manuscript Submission Due: June 15, 2027
  • Expected Publication: Mid 2027

Guest Editors

  • Prof. Boualem Benatallah, Dublin City University, Insight SFI Research Centre on Data Analytics, Ireland
  • Dr. Massimiliano Garda, University of Brescia, Department of Information Engineering, Italy
  • Dr. Ada Bagozi, University of Brescia, Department of Information Engineering, Italy
  • Dr. Ilche Georgievski, University of Stuttgart, Department of Service Computing, Germany
  • Distinguished Prof. Michael Sheng, School of Computing, Macquarie University, Australia
  • Prof. Yingjie Wang, School of Computer and Control Engineering, Yantai University, China

Submission Guidelines

For author information and guidelines on submission criteria, please visit Author Resources. Authors should submit original manuscripts not exceeding 14 pages, following the IEEE Transactions on Services Computing guidelines. All submissions must be made through the IEEE Author Portal. Please select "Special Issue on Large Language Models in Service-Oriented Ecosystems Design: Advances and Applications" during submission. Manuscripts must not be published or under review elsewhere.

This Special Issue accepts extended versions of papers from conferences or workshops, under certain restrictions: the extended paper must present 30% additional core contributions (i.e., not limited to extending the related work and background sections) and must have a similarity score lower than 50% relative to the proceedings version. The proceedings paper must be provided at the time of submission, and a description of the extension must be included in the cover letter.

In addition to submitting your paper to IEEE Transactions on Services Computing, you are also encouraged to upload the data related to your paper to IEEE DataPort. IEEE DataPort is IEEE's data platform that supports the storage and publishing of datasets while also providing access to thousands of research datasets. Uploading your dataset to IEEE DataPort will strengthen your paper and will support research reproducibility. Your paper and the dataset can be linked, providing a good opportunity for you to increase the number of citations you receive. Data can be uploaded to IEEE DataPort prior to submitting your paper or concurrent with the paper submission.
