
Famous Graphics Chips: Nvidia’s GeForce 256

By Dr. Jon Peddie on February 25, 2021
The first fully integrated graphics processing unit (GPU)

Image credit: Konstantin Lanzet, Wikipedia

The term GPU has been in use since at least the 1980s. Nvidia popularized it in 1999 by marketing the GeForce 256 add-in board (AIB) as the world's first GPU. It offered integrated transform, lighting, triangle setup/clipping, and rendering engines in a single-chip processor.

Very-large-scale integration (VLSI) started taking hold in the early 1990s. As the number of transistors engineers could incorporate on a single chip increased almost exponentially, the number of functions in the CPU and the graphics processor increased. One of the biggest consumers of CPU cycles was graphics transformation, which drove the migration of those compute elements into graphics processors. Architects from various graphics chip companies decided that transform and lighting (T&L) was a function that belonged in the graphics processor. A T&L engine is a vertex shader and a geometry translator; that little fixed-function pipeline (FFP) has gone by many names.

In 1997, 3Dlabs (in the UK) developed its Glint Gamma processor, the first programmable transform and lighting engine, as part of its Glint workstation graphics chips, and introduced the term GPU, for geometry processor unit. 3Dlabs' GPU was a separate chip named Delta and was known as the DMX; 3Dlabs' GMX was a co-processor to the Glint rasterizer. Then in October 1999, Nvidia introduced the NV10 GPU with an integrated T&L engine as its consumer graphics chip. ATI quickly followed with its Radeon graphics chip and called it a visual processing unit (VPU). But Nvidia popularized the term GPU and has ever since been associated with it and credited with inventing the GPU.

Built on TSMC's 220 nm process, the 120 MHz NV10 had 17 million transistors in a 139 mm² die and supported DirectX 7.0. The GeForce 256 AIB employed the NV10 with SDR memory. Before Nvidia started offering Nvidia-branded AIBs, the company relied on AIB partners to build and sell the boards; however, Nvidia did offer a reference design to its OEM partners. The first AIB to use the 64 MB SDR was ELSA's ERAZOR X, which used its own design to create an NLX form-factor board.

Figure 1: ELSA NV10-based Erazor X AIB (Source: Hayes, Wikipedia)

The GPU had a large 128-bit memory interface and could use DDR or SGRAM memory, a choice made by OEM board partners and usually driven by a price-performance trade-off. The AIB shown in Figure 1 has four 8 MB SGRAM chips. Since it was a 1999 AIB, it used the AGP 4X interface with sustained DMA, and it supported the Direct3D 7.0 API and OpenGL 1.2.1 with transform and lighting.

Figure 2: Nvidia GeForce 256 (NV10) view of OpenGL (Source: Nvidia/SIGGRAPH Asia 2008)

The chip had many advanced features, including four independent pipelined engines that ran at 120 MHz, allowing the GPU to produce a 480 Mpix/sec fill rate (4 pipelines × 120 MHz = 480 Mpix/sec). The video output was VGA, with hardware alpha-blending, and was HDTV (1080i) compliant. In addition to its advanced graphics features, the chip also had powerful video processing capability: it offered TV-out with integrated NTSC/PAL encoders, and it supported S-VHS and composite video input, as well as stereo 3D.
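To make the offloaded work concrete, here is a minimal sketch in C of the kind of per-vertex arithmetic a fixed-function T&L stage performs: transform the position by a combined modelview-projection matrix, then evaluate one directional diffuse light. This is illustrative only, not Nvidia's hardware design; all type and function names are hypothetical.

/* Illustrative per-vertex T&L math: a matrix transform plus one
   directional diffuse (Lambertian) light. Hypothetical names, not
   the NV10's actual design. */
#include <stdio.h>

typedef struct { float x, y, z, w; } Vec4;
typedef struct { float m[4][4]; } Mat4;  /* row-major */

/* Transform: v' = M * v, i.e., four dot products per vertex */
Vec4 transform(const Mat4 *M, Vec4 v) {
    Vec4 r = {
        M->m[0][0]*v.x + M->m[0][1]*v.y + M->m[0][2]*v.z + M->m[0][3]*v.w,
        M->m[1][0]*v.x + M->m[1][1]*v.y + M->m[1][2]*v.z + M->m[1][3]*v.w,
        M->m[2][0]*v.x + M->m[2][1]*v.y + M->m[2][2]*v.z + M->m[2][3]*v.w,
        M->m[3][0]*v.x + M->m[3][1]*v.y + M->m[3][2]*v.z + M->m[3][3]*v.w
    };
    return r;
}

/* Diffuse intensity = max(0, N . L), with N and L unit vectors */
float diffuse(Vec4 n, Vec4 l) {
    float d = n.x*l.x + n.y*l.y + n.z*l.z;
    return d > 0.0f ? d : 0.0f;
}

int main(void) {
    Mat4 mvp = {{{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}}}; /* identity */
    Vec4 pos    = {1.0f, 2.0f, 3.0f, 1.0f};
    Vec4 normal = {0.0f, 1.0f, 0.0f, 0.0f};
    Vec4 light  = {0.0f, 1.0f, 0.0f, 0.0f};  /* straight overhead */
    Vec4 p = transform(&mvp, pos);
    printf("clip pos = (%g, %g, %g, %g), diffuse = %g\n",
           p.x, p.y, p.z, p.w, diffuse(normal, light));
    return 0;
}

Before hardware T&L, the host CPU performed those four dot products, plus the lighting math, for every vertex of every frame; the NV10 moved that work onto the graphics chip.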

Summary

Integrating transform and lighting capability into the GPU was a significant differentiator for the GeForce 256. Before it (and the stand-alone T&L processor from 3Dlabs), competing 3D accelerators used the CPU to run those functions. Incorporating the T&L capability reduced cost for consumer AIBs while simultaneously improving performance; prior to the GF256, only professional AIBs designed for CAD had a T&L co-processor engine. Integrated T&L also expanded Nvidia's business by allowing the company to enter the professional graphics market. Nvidia marketed those AIBs as Quadro. The Quadro AIBs used the same NV10 as the GeForce AIBs, paired with drivers certified for various professional graphics applications.
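Part of why integrated T&L paid off immediately is that applications did not need to change: under OpenGL 1.x's fixed-function API (the GeForce 256 supported OpenGL 1.2.1 with T&L), a program declares its transforms and lights, and the driver decides whether that math runs on the CPU or on the GPU's T&L engine. A minimal illustrative C fragment, with window/context setup omitted:

/* Fixed-function OpenGL 1.x: declare transform and lighting state;
   the driver maps it to the CPU or to hardware T&L transparently. */
#include <GL/gl.h>

void setup_transform_and_lighting(void) {
    GLfloat light_dir[4] = {0.0f, 1.0f, 0.0f, 0.0f};  /* w=0: directional */

    /* Transform state: applied to every vertex that follows */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glRotatef(30.0f, 0.0f, 1.0f, 0.0f);   /* rotate scene 30 deg about Y */

    /* Lighting state: one light, evaluated per vertex */
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, light_dir);
}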


Jon Peddie is a recognized pioneer in the graphics industry, president of Jon Peddie Research, and has been named one of the most influential analysts in the world. He lectures at numerous conferences and universities on topics pertaining to graphics technology and emerging trends in digital media technology. A former president of the Siggraph Pioneers, he serves on the advisory boards of several conferences, organizations, and companies, and contributes articles to numerous publications. In 2015, he was given the Lifetime Achievement award from the CAAD Society. Peddie has published hundreds of papers to date and has authored and contributed to 11 books, his most recent being Ray Tracing: A Tool for All.

Disclaimer: The author is completely responsible for the content of this article. The opinions expressed are their own and do not represent IEEE's position nor that of the Computer Society nor its Leadership.
