Call for Papers: Special Issue on Graph Foundation Models: Database and Data Mining Perspectives (GFM)

IEEE Transactions on Knowledge and Data Engineering seeks submissions for upcoming issues.

Submission Deadline: 01 February 2026

Publication Date: Late 2026


Graphs, which encapsulate complex interrelations among objects, are ubiquitous non-Euclidean structures found in domains ranging from recommender systems and social media analysis to financial technology and drug discovery. With the explosion of data, graphs are becoming increasingly large and complex. Neural graph databases have been introduced to manage large-scale graphs while enabling graph inference with graph neural networks. Recently, foundation models such as Large Language Models (LLMs) have marked a revolutionary advance in addressing numerous tasks with universally pretrained models. Graph data and inference tasks are diverse, yet, unlike in the language and vision domains, foundation models for graphs remain in their infancy.

Graph Foundation Models (GFMs) are a novel family of general-purpose graph models that are pre-trained at scale on diverse graph data, raising new challenges in both the graph mining and graph database domains. Recent advances have explored leveraging LLMs to build GFMs; however, this line of work often struggles with graph inference, particularly when complex structural patterns are involved. Other efforts design GFMs using graph neural networks, yet fundamental challenges hinder their scalability. Key open issues include managing large-scale graphs, enabling distributed training of graph models, improving graph knowledge transferability, and accelerating both LLM and graph inference. These challenges make the discussion of GFMs both urgent and timely.
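
To illustrate why the LLM-based route struggles with structural patterns, the following is a minimal, hypothetical Python sketch of how a small graph might be serialized into a text prompt so an LLM can be asked a structural question (here, triangle counting). The function name and prompt format are illustrative assumptions, not part of any specific GFM system; the sketch only indicates why purely textual graph descriptions make complex structural inference difficult.

    # Hypothetical sketch: serialize an edge list into a text prompt for an LLM.
    # The prompt format and helper name are illustrative assumptions only.
    def graph_to_prompt(edges, question):
        """Describe a graph edge-by-edge in plain text, then append a question."""
        lines = [f"Node {u} is connected to node {v}." for u, v in edges]
        return "\n".join(lines) + "\nQuestion: " + question

    # Toy graph with one triangle (0-1-2) plus a pendant node 3.
    edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
    print(graph_to_prompt(edges, "How many triangles does this graph contain?"))
    # The model must recover the triangle 0-1-2 from flat text alone, which is
    # exactly the kind of structural reasoning where LLM-based GFMs often struggle.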

This special issue aims to bring together researchers and practitioners from academia and industry to present their latest findings on graph mining, graph databases, and LLMs, with particular emphasis on graph foundation models. We invite submissions that address fundamental issues, propose novel models, or showcase compelling applications that shed light on the next-generation graph engineering paradigm.

Topics

This special issue will cover a wide range of topics on graph foundation models, including but not limited to:

  • Large Language Models and Graphs
      • Clique counting with LLMs and graph reasoning agents
      • Graph tokenization, Mixture-of-Experts, and Mixture-of-Thought prompting
      • Graph post-training, instruction tuning, prompting, and in-context reasoning
      • Hybrid GNN–LLM architectures and graph-language co-training
      • Graph in-context learning and graph retrieval-augmented generation
  • Graph Databases and Management of Billion-scale Graphs
      • Graph databases and neural graph databases
      • Sparsification and sampling methods for graphs
      • Distributed and decentralized graph training
      • Scalable and efficient graph transformers
  • New Techniques for Graphs, Structures, and Geometries
      • Acceleration of graph queries and graph inference
      • Methodologies for heterophilic, heterogeneous, directed, or imbalanced graphs
      • Riemannian, non-Euclidean, and mixed-curvature graph models
      • Graph structure generation, and topological analysis of symmetry and equivariance
      • Model quantization, graph topological pattern injection, and graph knowledge distillation
  • Knowledge Transfer among Graphs
      • Cross-domain and few-shot graph knowledge transfer
      • Domain alignment in pretraining
      • Parameter-efficient graph fine-tuning (LoRA, adapters)
      • Prototype-based adaptation and test-time tuning for graphs
  • Trustworthiness and Privacy on Graphs
      • Privacy-preserving graph neural networks
      • Adversarial attacks and graph poisoning
      • Explainability and interpretability in graph engineering
      • Causality and counterfactual learning on graphs
  • Real-world Applications
      • Knowledge bases and knowledge graphs
      • Drug discovery and models for molecular graphs
      • Financial transaction network analysis
      • Dynamic interacting systems and multiagent systems
      • Management of spatio-temporal graphs and transportation systems
  • Datasets and Benchmarking
      • Synthetic graph corpus generation
      • Graph foundation model benchmarking protocols
      • Open-world heterogeneous GFM benchmarks

Submission Instructions:

For author information and guidelines on submission criteria, visit the Author’s Information Page. Please submit papers through the IEEE Author Portal and be sure to select the special issue or special section name. Manuscripts must not have been published and must not be currently under submission for publication elsewhere. Please submit only full papers intended for review, not abstracts.

In addition to submitting your paper to TKDE, you are encouraged to upload the data related to your paper to IEEE DataPort. IEEE DataPort is IEEE's data platform that supports the storage and publishing of datasets while also providing access to thousands of research datasets. Uploading your dataset to IEEE DataPort will strengthen your paper and support research reproducibility. Your paper and the dataset can be linked, providing a good opportunity to increase the citations your work receives. Data can be uploaded to IEEE DataPort before submitting your paper or concurrently with the paper submission.


Guest Editors

  • Philip S. Yu (Lead Guest Editor), University of Illinois Chicago, USA
  • Pietro Liò, University of Cambridge, UK
  • Li Sun, North China Electric Power University, China
  • De-Nian Yang, Academia Sinica, Taiwan