Delivering learning-based artificial intelligence (AI) capabilities through API service endpoints at the edge, under the as-a-service model, is essential to meet the growing demand for integrating big data, the Internet of Things (IoT), and cloud computing. Edge AI services expedite data-driven decision making (e.g., optimizing control, coordination, investment, and medical service allocation). A representative research challenge is to improve the learning and/or inference efficiency of an edge AI service (or microservice) under development while meeting requirements for a minimum level of quality of service (QoS), processing power, energy consumption, robustness, extensibility, and/or DevOps support. Policy-based federated learning services at the edge are typical edge AI use cases in which data processing and management constraints must be considered in terms of data origin, ownership, heterogeneity, security, and privacy. 5G/6G edge nodes provide unprecedented, high-QoS wireless operating environments for edge AI services and related applications.
This special issue aims to promote high-quality research on recent advances in edge AI-as-a-service and to inspire related research efforts. Topics of interest include, but are not limited to, the following:
Please select “SI on Edge AI-as-a-Service” when submitting your manuscript in the online system at https://mc.manuscriptcentral.com/tsc-cs, and please read the author guidelines before submitting. Manuscripts must not exceed 14 pages. Submitted manuscripts should not have been previously published nor be currently under review for publication elsewhere. Moreover, they should provide at least 30% original technical contribution compared with previous publications.