Famous Graphics Chips: The Integrated Graphics Controller

By Dr. Jon Peddie
Published 02/28/2020

Following Moore’s law, integrated graphics have become quite powerful and popular

Integrated graphics have been with us since 1991 in the workstation space and since 1995 in the PC. They have now found their way into smartphones, tablets, automobiles, and game consoles.


Integrated graphics have evolved from being part of the chipset to being integrated within the CPU. Intel did that first in 2010, and AMD followed with the Llano in 2011, but with a much bigger and more powerful GPU. In between, we saw a half dozen or more clever, innovative designs from various suppliers, many of them no longer with us.

1991, May — One of the first examples of integrating a graphics controller with other components was the SPARC enhancement chipset from Weitek. This chipset consisted of two parts: the W8701 SPARC microprocessor and the W8720 Integrated Graphics Controller (IGC). The W8701 integrated a floating-point processor (FPP) into a SPARC RISC microprocessor. It ran at 40 MHz and was socket- and binary-compatible with the SPARC integer unit (IU) standard.

1995, June — Taiwan-based Silicon Integrated Systems introduced the SiS6204, the first PC-based integrated graphics controller (IGC) chipset for Intel processors. It combined the northbridge functions with a graphics controller and set the stage for a new category — the IGC.

SiS developed two IGCs: the 6204 for the 16-bit ISA bus and the 6205 for the newer PCI bus. The graphics controller offered an integrated VGA with resolution up to 1280 × 1024 at 16.8 million colors (but interlaced), and a 64-bit BitBLT engine with an integrated Philips SAA 7110 video decoder interface that provided YUV 4:2:2 support, color-key video overlay support, a color-space converter, integer video scaling in 1/64th-unit increments, and VESA DDC1 and DDC2B signaling support. It offered a UMA capability in conjunction with SiS's 551x UMA chipsets. Most importantly, however, it proved what one could integrate into a small, low-cost chip. SiS and ALi were the only two companies initially awarded licenses to produce third-party chipsets for the Pentium 4.
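The color-space converter in a chip like this turns decoded YUV video into RGB for display. As a rough illustration, here is a minimal software sketch of that conversion, assuming full-range BT.601 coefficients (an assumption chosen for illustration; SiS did not publish the exact arithmetic its hardware used):

```python
# Illustrative software equivalent of the fixed-function color-space conversion
# an IGC's video overlay path performs. Assumes full-range BT.601 YCbCr, a common
# convention for decoded video of that era, not a documented SiS parameter.
def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, int(round(v))))
    return clamp(r), clamp(g), clamp(b)

print(ycbcr_to_rgb(128, 128, 128))  # mid-gray stays (128, 128, 128)
```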

1999, January — In the late 1990s, workstation giant Silicon Graphics Inc. (SGI) was trying to meet the oncoming threat of the popular and ever-improving x86 processors from Intel. SGI developed the Visual Workstation 320 and 540 workstations using an Intel Pentium processor and designed the Cobalt IGC. It was a massive chip for the times, with over 1,000 pins, and it cost more than the CPU. It also showed what performance one could obtain with a unified memory architecture (UMA), one where the graphics processor shares system memory with the CPU. It allowed up to 80 percent of the system RAM to be made available for graphics. However, the allocation was static and could only be adjusted via a profile.
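To make the static split concrete, the sketch below carves a fixed fraction of system RAM out for graphics at configuration time. The 80 percent ceiling comes from the Cobalt description above; the helper function itself is hypothetical, not SGI firmware behavior:

```python
# Illustrative only: how a static UMA profile might divide system RAM between
# the CPU and graphics. The 80 percent cap is the figure quoted for Cobalt above.
def uma_split(total_ram_mb, graphics_fraction):
    assert 0.0 <= graphics_fraction <= 0.80, "Cobalt allowed at most 80% for graphics"
    gfx_mb = int(total_ram_mb * graphics_fraction)
    return gfx_mb, total_ram_mb - gfx_mb

gfx, cpu = uma_split(256, 0.50)  # a 256 MB system giving half to graphics
print(f"graphics: {gfx} MB, CPU: {cpu} MB")
```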

1999, April — Intel had been leading the industry in integrating more functions and capabilities into the CPU. In 1989, when it introduced the venerable 486, it incorporated an FPP, the first chip to do so. Ten years later, the company introduced the 82810 IGC (codenamed Whitney).

1999, September — ArtX. David Orton, who had led the development of the Cobalt chipset as VP of Silicon Graphics' advanced-graphics division, left SGI and became president of ArtX. The company showed its first integrated graphics chipset with a built-in geometry engine at COMDEX in the fall of 1999, marketed by Acer Labs of Taiwan. Seeing that, Nintendo contacted ArtX to create the graphics processor (called the Flipper chip) for its fourth game console, the GameCube. Then, in February 2000, ATI announced it would buy ArtX.

2001, June — SiS introduced transform and lighting (T&L) to its IGC. Transformation is the task of producing a two-dimensional view of a three-dimensional scene. Clipping means drawing only the parts of the scene that will appear in the final image and discarding the rest. Lighting is the task of altering the color of the various surfaces of the scene based on lighting information. Arcade game system boards had used hardware T&L since 1993, and home video game consoles had since the Nintendo 64's Reality Coprocessor GPU (designed and developed by SGI) in 1996. Personal computers implemented T&L in software until 1999.
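As a rough illustration of the per-vertex work that fixed-function T&L hardware took over from the CPU, here is a minimal software sketch: a matrix transform (the "T") followed by a simple Lambertian diffuse term (the "L"). The matrix, vertex, and light values are made-up illustrative numbers, not anything from SiS's design:

```python
# Minimal sketch of per-vertex transform and lighting in software.
import numpy as np

def transform_and_light(vertex, normal, model_view, light_dir, base_color):
    # Transform the position into view space (the "T" in T&L).
    pos = model_view @ np.append(vertex, 1.0)
    # Rotate the normal (valid here because the matrix is rigid) and normalize it.
    n = model_view[:3, :3] @ normal
    n = n / np.linalg.norm(n)
    # Simple Lambertian diffuse term (the "L" in T&L).
    diffuse = max(0.0, float(n @ light_dir))
    return pos[:3], base_color * diffuse

model_view = np.eye(4)                      # identity transform for simplicity
pos, color = transform_and_light(
    np.array([1.0, 0.0, 0.0]),              # vertex position
    np.array([0.0, 0.0, 1.0]),              # vertex normal
    model_view,
    np.array([0.0, 0.0, 1.0]),              # unit vector toward the light
    np.array([1.0, 0.5, 0.2]))              # base surface color
print(pos, color)                           # full brightness: the normal faces the light
```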

With the introduction of geometry processing and T&L, the IGC evolved into the IGP — integrated graphics processor.

2001, June — Nvidia introduced its IGP, the nForce 220, for the AMD Athlon CPU.

 


Figure 1: Nvidia’s nForce IGP (Source Nvidia)

 

The nForce was a motherboard chipset created by Nvidia for the AMD Athlon and Duron (later versions, from the 5 series up, added support for Intel processors). The chipset shipped in three varieties: the 220, 415, and 420. The 220 and 420 were very similar, each having the integrated GPU.

When Intel moved from a parallel bus architecture to a serial link interface (copying the HyperTransport design from AMD), it also declared Nvidia's bus license invalid. After a protracted legal battle, Nvidia won a settlement from Intel and, in 2012, exited the IGP market, leaving only AMD, Intel, and the small Taiwanese supplier Via Technologies. All the other companies in the market were either bought or driven out by competition.

2002, January — ATI. Two years after its acquisition of ArtX, ATI introduced its first IGC, the IGP 320 (code-named the ATI A3).


Figure 2: ATI’s IGP for the AMD Athlon processor (Source ATI)

 

Four years after ATI introduced its IGC, AMD bought ATI to develop a processor with a real GPU integrated. At the time, Dave Orton was ATI’s CEO. However, it proved harder to do than either company thought: different fabs, different design tools, and, most difficult of all, different corporate cultures.

2004, July — Qualcomm introduced its first integrated graphics processor in the new MSM6150 and MSM6550, using ATI’s Imageon graphics processor.

 


Figure 3: Qualcomm’s MSM6550 SoC (Source Qualcomm)

 

The graphics processor could support 100k triangles/second and 7M pixels/second, providing console-quality gaming and graphics.

2005, October — Texas Instruments introduced the OMAP 2420, which Nokia used in the N92 and then the N95.

 


Figure 4: Texas Instruments’ OMAP 2420 SoC with integrated GPU, circa 2007 (Source TI)

 

TI used an Imagination Technologies PowerVR GPU design for its OMAP processors. TI was successful with the OMAP in mobile until about 2012, when Apple- and Qualcomm-based phones took over the market.

2007, June and November — Apple introduced the iPhone in the United States in June, and Qualcomm introduced the Snapdragon S1 MSM7227 SoC in November of the same year. Both companies had developed SoCs with integrated GPUs, primarily for the smartphone market. Apple used Imagination Technologies’ GPU design, and Qualcomm used ATI’s mobile Imageon GPU technology. In January 2009, AMD sold its Imageon handheld-device graphics division to Qualcomm.

2008 — Nvidia introduced the Tegra APX 2500 SoC with a 300 to 400 MHz integrated GPU and a 600 MHz ARM11 processor. Audi incorporated it in its entertainment systems, and other car companies followed. In March 2017, Nintendo launched its new Switch game console using a Tegra processor.

2010, January — In PC land, Intel beat AMD to it and introduced its Clarkdale and Arrandale processors with Ironlake graphics, branding them Celeron, Pentium, or Core with HD Graphics. The GPU’s specification was 12 execution units (shaders), delivering up to 43.2 GFLOPS at 900 MHz. The IGP could also decode H.264 1080p video at up to 40 fps.
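Those figures are mutually consistent if each execution unit retires four single-precision FLOPs per clock; that rate is an inference used here only to reconcile the quoted numbers, not a value from an Intel datasheet:

```python
# Back-of-the-envelope check of the quoted Ironlake peak: 12 EUs at 900 MHz.
# Assumes 4 single-precision FLOPs per EU per clock (an inference, not a spec).
execution_units = 12
clock_ghz = 0.9
flops_per_eu_per_clock = 4
peak_gflops = execution_units * flops_per_eu_per_clock * clock_ghz
print(peak_gflops)  # 43.2
```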

Intel built the first implementation, Westmere, as a multi-chip product in a single package, with the CPU using Intel’s 32 nm process and the GPU using 45 nm.

 


Figure 5: Intel was first to incorporate a GPU on the same die as a CPU (Source Intel)

 

The most significant difference between Clarkdale and Arrandale is that the latter had integrated graphics. Intel built the fully integrated 131 mm² processor, Sandy Bridge, a 4-core 2.27 GHz processor with the IGP, in its 32 nm fab.

2011, January — When AMD bought ATI, Hector Ruiz was president of AMD, and Dave Orton was president of ATI. Orton left AMD in 2007, and Ruiz left in 2008; they were the architects of the acquisition and of the dream of building a CPU with integrated graphics. It took three more years and two new CEOs after Ruiz left before AMD could introduce an integrated GPU-CPU, which it named an APU — accelerated processing unit. The first product, in 2011, was the Llano. The internal code name for the device was Fusion, and several people thought that should have been its marketing name, too, but there were already too many products in the world using that name.

 


Figure 6: AMD’s integrated CPU-GPU (Source AMD)

 

The Llano used the 4-core K10 x86 CPU and a Radeon HD 6000-series GPU on the same 228 mm² die. AMD had it fabricated at GlobalFoundries in 32 nm.

2013, November — Game consoles. Sony introduced the PlayStation 4 (PS4), and Microsoft launched the Xbox One, both based on a custom version of AMD’s Jaguar APU. Sony used an 8-core AMD x86-64 Jaguar 1.6 GHz CPU (2.13 GHz on the PS4 Pro) with an 800 MHz (911 MHz on the PS4 Pro) GCN Radeon GPU. Microsoft used an 8-core 1.75 GHz APU (two quad-core Jaguar modules), and the X model had a 2.3 GHz AMD 8-core APU. The Xbox One GPU ran at 853 MHz, the Xbox One S at 914 MHz, and the Xbox One X at 1.172 GHz, all using AMD’s Radeon GCN architecture.

Today — The integrated GPU (iGPU) is the most popular graphics device. It is cost-effective (essentially free) and powerful enough (“good enough”) for most tasks. It has even found growing acceptance in power-demanding workstation applications.

The iGPU is the dominant GPU in PCs; it is in 100 percent of game consoles, 100 percent of tablets and smartphones, and about 60 percent of automobiles, roughly 2.1 billion units in total.

GPUs are incredibly complicated and complex devices, with hundreds of 32-bit floating-point processors (called shaders) built from millions of transistors. It is only because of the miracle of Moore’s Law (an observation, really) that such things can be accomplished. Every day you engage with multiple GPUs: in your phone, your PC, your TV, your car, your watch, your game console, and through the cloud. The world would not have progressed to where it is without the venerable and ubiquitous GPU.