CLOSED Call for Papers: Special Issue on Processing in Memory

Computer designers have traditionally separated the roles of storage and compute: memories and caches store data, while a processor's logic units compute on it. Is this separation necessary? The human brain does not divide the two so distinctly, so why should a processor? The in/near-memory computing paradigm blurs this distinction, giving memory substrates a dual responsibility: storing data and computing on it. Modern processors and accelerators dedicate over 90% of their aggregate silicon area to memory. In/near-memory processing converts these memory units into powerful allies for massively parallel computing, which can accelerate a plethora of applications, including neural networks, graph processing, data analytics, and genome sequencing. Further, these architectures offer an order of magnitude higher bandwidth for accessing data and shave off data-movement costs.
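To make the paradigm concrete, here is a minimal, hypothetical sketch (not any specific architecture or product) of the idea behind bulk in-memory bitwise computing: a single command operates on entire memory rows at once, rather than streaming each word through the CPU. NumPy arrays stand in for DRAM rows, and one vectorized operation models a row-wide compute.

```python
import numpy as np

def in_memory_and(row_a: np.ndarray, row_b: np.ndarray) -> np.ndarray:
    """Model a row-wide bitwise AND computed inside the memory array.

    One 'command' produces the AND of two full rows; no per-word
    transfers to a processor are involved in the model.
    """
    return row_a & row_b

# Two 1 KiB memory rows of packed bits (8,192 bits each).
rng = np.random.default_rng(0)
row_a = rng.integers(0, 256, size=1024, dtype=np.uint8)
row_b = rng.integers(0, 256, size=1024, dtype=np.uint8)

result = in_memory_and(row_a, row_b)
# A conventional CPU would move 2 KiB in and 1 KiB out to compute the
# same result; the in-memory version avoids that data movement.
```

The names and row sizes here are illustrative only; real proposals differ widely in how rows are activated and how results are written back, which is exactly the design space this special issue covers.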

This special issue of IEEE Micro will explore academic and industrial research on topics that relate to in-memory computing. Topics include, but are not limited to:

  • In/near-memory architectures for general-purpose processing
  • In/near-memory domain-specific accelerator architectures for machine learning, graph processing, data analytics, genomics, and other exciting applications
  • Emerging memory technologies for in/near-memory computing
  • Evaluation of industry/academic in/near-memory prototypes
  • Programming models, compilers, and data-offloading architectures
  • Interaction of in/near-memory components with CPU architecture
  • Multi-tier memory hierarchy architectures for in/near-memory computing
  • Memory bottlenecks for emerging data-centric applications
  • Memory-centric automata processing
  • In/near-memory computing for edge/IoT/embedded systems
  • Security implications of in/near-memory computing

Important Dates

Submission deadline: May 4, 2021

Initial notifications: June 29, 2021

Revised papers due: August 3, 2021

Final notifications: August 24, 2021

Final versions due: September 14, 2021

Publication: November/December 2021

Submission Guidelines

For the manuscript submission, acceptable file formats include Microsoft Word and PDF. Manuscripts should not exceed 6,000 words including references, with each average-size figure counting as 250 words toward this limit. Please include all figures and tables, as well as a cover page with author contact information (name, postal address, phone, fax, and email address) and a 200-word abstract. Submitted manuscripts must not have been previously published or currently submitted for publication elsewhere, and all manuscripts must be cleared for publication. All previously published papers must have at least 30% new content compared to any conference (or other) publication. Accepted articles will be edited for structure, style, clarity, and readability. Read IEEE Micro's full submission guidelines.

When you are ready to submit your manuscript, log into ScholarOne Manuscripts and submit your manuscript. Please direct ScholarOne-related questions to the IEEE Micro magazine assistant at micro-ma@computer.org.

Questions?

Contact guest editor Dr. Reetuparna Das (micro6-21@computer.org) or editor-in-chief Lizy Kurian John (ljohn@ece.utexas.edu).
