It’s Not All About the GPU Anymore
JUL 17, 2018 01:37 AM


by Dr. Jon Peddie
 
Evolving Future of Processors
 
We’ve been racing to keep pace with Moore’s Law for decades. GPUs have become our daily workhorses for visualization and for many aspects of compute. We’re close to adding ASICs, FPGAs, and quantum computing to that mix. Where is this all heading? What other aspects of computation need to be on our radar? Which ones are going to make a difference, and why? When should I care?
 
Evolution vs. Revolution
 
The hyper-compute-dense GPU has been a marvel at accelerating novel workloads. Who knew, when 3Dlabs introduced programmable vertex shaders in 2002, or when TI shipped the TMS34010 16 years earlier in 1986, that programmable graphics processors would find their way into supercomputers, scientific instruments, autonomous vehicles, neural networks and machine learning, and the inferencing machines we carry in our pockets?
 
The GPU, with its massive parallel processing capability (over 5,000 cores in Nvidia’s latest behemoth, built from over 21 billion transistors), has become the darling of the industry, so much so that even Intel has finally gotten into the game and scarcely a day goes by without some new announcement of its application.
 
But despite the GPU’s broad applicability, and the industry’s love affair with it, the world has shifted in the last 18 to 24 months. Whereas software used to be king and we had three tried-and-true processors, the venerable x86, ARM, and the GPU, we now have a plethora of new processors being developed to enable and empower the exploding areas of artificial intelligence, machine learning, robots, and autonomous things.
 
The big news and excitement about processors this year revolves around four major technology areas, led by applications, as they should be (rather than new processor designs looking for applications). Those application segments (which have dozens of subsegments), in alphabetical order, are:
  • Artificial intelligence
  • Blockchain
  • Cryptocurrency
  • Internet of Things
And unlike the evolution and introduction of applications in the past, which were built on the platforms available at the time, these new applications are demanding and inspiring new architectures and processors. We’re also seeing the clever application of existing processors to the new applications.
 
Interesting Interrelatedness
 
The other interesting thing about these new applications is how interrelated they are. Artificial intelligence (AI), which is also often referred to as machine and deep learning, relies on and requires a large sample base, often referred to as big-data. Internet of Things (IoT) devices, often called smart sensors, generate large quantities of data. That data can be effectively, efficiently, and securely captured, stored, and distributed via blockchain mechanisms. And if the data, or the AI training has to be paid for, it can be done via cryptocurrency exchanges.
 
AI
 
Artificial intelligence is one such application; it was originally run on x86 servers. Because of the data-parallel nature of AI, it was soon learned that a massive, low-cost parallel processor like the GPU could be applied to these workloads. But even the GPU, with its incredible compute density and compute efficiency, was not good enough for everyone, and so some organizations developed application-specific solutions using FPGAs and ASICs for convolutional neural network (CNN) workloads. The ASICs go by various names, probably the best known being Google’s Tensor Processing Unit, or TPU, and the Tensor cores Nvidia added to its Volta GPU.
 
 
Figure 1: Google has developed and made available their TensorFlow library and AI examples, which are used as a sort of benchmark.
 
Other examples can be found, such as Intel’s Nervana. IBM developed the TrueNorth neuromorphic CMOS ASIC in conjunction with the DARPA SyNAPSE program, and other companies such as ST, HiSilicon, Rockchip, and MediaTek have developed AI/CNN processors.
 
Training
 
When you start listing AI processor suppliers, you have to segregate them into training and inferencing applications. The big “iron” processors like AMD’s, IBM’s, Intel’s, and Nvidia’s are used for sucking in massive amounts of big data to train an algorithm on how to find cats, terrorists, or glaucoma.
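As a rough sketch (not drawn from any particular vendor’s stack), here is what a small training run looks like using TensorFlow’s Keras API; the tiny CNN and the MNIST digits dataset are placeholders for the far larger models and datasets these big machines chew through:

```python
# Minimal training sketch in TensorFlow/Keras. The tiny CNN and the MNIST
# digits dataset stand in for the much larger models and "big data" used in
# practice; TensorFlow dispatches the math to whatever GPU, TPU, or other
# accelerator it finds.
import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0      # add a channel axis, scale to 0..1

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=128)
model.save("trained_cnn.h5")              # hand the trained model to inferencing
```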
 
Inferencing
 
Once the algorithms have been trained and tuned, they can then be deployed to smaller processors, such as those made by HiSilicon, MediaTek, Nvidia, Qualcomm, Rockchip, ST, and others, to do inferencing. Examples would be facial recognition of you for a security sign-in, or recognizing Alexa’s name and an instruction. The work commissioned by the instruction (“Alexa, what time is it in Moscow?”) is done in the cloud on big AI machines.
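One common route from a big training machine to those small inferencing processors is a converter such as TensorFlow Lite. A hedged sketch, reusing the hypothetical trained_cnn.h5 model from the training example above:

```python
# Sketch of shrinking a trained model for on-device inferencing with
# TensorFlow Lite, one common path to the small processors listed above.
# "trained_cnn.h5" is the hypothetical model saved in the training sketch.
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("trained_cnn.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]   # quantize for small devices
tflite_model = converter.convert()

# On the device, a lightweight interpreter runs the converted model.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
dummy_image = np.zeros(inp["shape"], dtype=inp["dtype"])   # stand-in camera frame
interpreter.set_tensor(inp["index"], dummy_image)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))                # class probabilities
```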
 
Blockchain
 
Blockchain is a virtual application, in that it doesn’t run on just your computer, but on everyone’s computer.
 
Blockchain networks are simply lots of virtual machines, or “nodes,” each connected to every other node to create a mesh. Each node runs a copy of the entire blockchain and competes to mine the next block or validate a transaction. Whenever a new block is added, the blockchain is updated and propagated to the entire network, so that every node stays in sync.
 
A blockchain is, at its heart, a distributed ledger. There are free and commercial blockchain programs one can use and customize for individual needs, such as Ethereum, MultiChain, and HyperLedger.
 
Ethereum and MultiChain are products that claim to be open to some degree. HyperLedger was developed by IBM and given to the Linux Foundation. The licensing on these programs is not yet clear, so one needs to investigate before implementing.
 
To become a node in a network, one’s computer has to download and keep updated a copy of the entire blockchain. To achieve this, blockchain platforms like HyperLedger or Ethereum provide tools that you can download, use to connect to the specific blockchain network, and then use to interact with it.
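For a sense of what that ledger looks like under the hood, here is a toy sketch (standard-library Python only, not any real blockchain’s code) of the chained-hash structure that every node stores and re-checks:

```python
# Toy sketch of a blockchain ledger: each block carries the hash of the
# previous block, so tampering with any block breaks the chain for every
# node that holds a copy. Python standard library only; not a real protocol.
import hashlib
import json
import time

def make_block(prev_hash, transactions):
    block = {"timestamp": time.time(),
             "transactions": transactions,
             "prev_hash": prev_hash}
    # Hash the block contents; json.dumps runs before the "hash" key is added.
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def chain_is_valid(chain):
    # Every node can independently verify that the links are intact.
    return all(curr["prev_hash"] == prev["hash"]
               for prev, curr in zip(chain, chain[1:]))

genesis = make_block("0" * 64, ["genesis"])
chain = [genesis, make_block(genesis["hash"], ["Alice pays Bob 5"])]
print(chain_is_valid(chain))   # True until any block is altered
```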
 
Because of the mesh nature of blockchaining, GPUs have proven to be particularly well suited to it.
 
Cryptomining
 
For a blockchain transaction to work, it has to be verified. The verification can be done by anyone, and those doing it charge a fee for it. People set up their computers to scan the network looking for open or waiting transactions. That is known as blockchain mining. And since the token of payment for providing the verification is a cryptocurrency, it has become known as cryptocurrency mining, or simply cryptomining.
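As an illustrative sketch (a simplified, Bitcoin-style proof of work, not any specific network’s rules), the “work” miners sell is a brute-force hash search like this, which maps neatly onto thousands of GPU cores:

```python
# Toy proof-of-work sketch: find a nonce whose hash meets a difficulty target
# (a run of leading zeros). Miners brute-force this search; verifiers only
# need one hash to confirm the work was done. Simplified, Bitcoin-style rules.
import hashlib

def mine(block_data: str, difficulty: int = 4):
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("Alice pays Bob 5")
print(nonce, digest)           # anyone can re-hash once to verify the claim
```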
 
To use a blockchain, you need a driver suited to the processor you want to use. Typically, a GPU is used, and so you can get a blockchain driver from AMD, Intel, or Nvidia for their GPUs. Those drivers are what cryptominers use.
 
Internet of Things
 
We already live in a world of sensors, counters, and taggers. Modern factories, hospitals, automobiles and airplanes, and most homes and businesses have dozens of sensors to measure temperature, door openings, the speed of rotating devices, pressure, humidity, color, and so on. Data is also collected by point-of-sale (POS) devices. All that data is sent to servers either continuously or in bursts, depending on what is happening and where. For example, there’s no need to report the steady-state temperature or rotation of a machine more often than maybe once an hour, but there is a potentially critical need to report it if it changes in a fraction of a second.
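A minimal sketch of that reporting policy, with made-up placeholder functions (read_sensor, send_to_server) rather than any real IoT API:

```python
# Sketch of the reporting policy described above: send a routine reading on a
# slow heartbeat, but report immediately when the value moves outside a
# tolerance band. read_sensor and send_to_server are hypothetical callables.
import time

HEARTBEAT_SECONDS = 3600      # steady-state report roughly once an hour
TOLERANCE = 2.0               # allowed drift before an immediate report

def monitor(read_sensor, send_to_server):
    last_value = read_sensor()
    last_report = time.time()
    send_to_server(last_value)
    while True:
        value = read_sensor()
        now = time.time()
        changed = abs(value - last_value) > TOLERANCE
        heartbeat_due = (now - last_report) >= HEARTBEAT_SECONDS
        if changed or heartbeat_due:
            send_to_server(value)        # event-driven or scheduled report
            last_value, last_report = value, now
        time.sleep(0.1)                  # sample often, report sparingly
```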
 
Internet of Things devices, despite their tiny size and ubiquitous deployment, are being upgraded with smart sensors capable of wireless communications, and in some cases of operating without external power.
 
And these smart sensors and POS terminals spew out data every day, in some cases all day long, which leads us back to AI.
 
Robots
 
Robots might be considered an application, though more likely a system or device when a physical manifestation is envisioned. However, there are hundreds of software robots, such as telephone answering menu systems with voice recognition, and bots that post Twitter comments.
 
Robots of course need AI training to function. And if it’s a physical robot, it will have lots of sensors, and their data may be used for real-time correction (and/or protection), and potentially fed to a server for further analysis and program refinement. You can think of an autonomous vehicle as a robot.
 
Summary
 
The number and types of applications, opportunities, and challenges being presented in our modern world are mind-boggling and difficult to keep up with, let alone become expert in. As new concepts and vocabularies are introduced, so too will come confusion and misunderstanding about the terms, devices, functions, and dangers. One thing that is clear is that one size or type of processor does not fit all applications or needs; we will have dozens of similar and specialized processors, most of which we will not even be aware of, nor should we be if they are to do their job.
 
Moore’s Law has been the engine empowering these science-fiction-like developments, and although some of the economic certainties of Moore’s Law may have eroded, the overall benefit of high-density, low-cost, nano-scale compute capability has not.
 
Life will get much better, maybe more complicated and challenging, but better overall. 
 
 
Dr. Jon Peddie is one of the pioneers of the graphics industry and formed Jon Peddie Research (JPR) to provide customer-intimate consulting and market forecasting services, through which he explores developments in computer graphics technology to advance economic inclusion and improve resource efficiency.
 
Recently named one of the most influential analysts, Peddie regularly advises investors in the technology sector. He is an advisor to the U.N., to several companies in the computer graphics industry, and to the Siggraph Executive Committee, and in 2018 he was accepted as an ACM Distinguished Speaker. Peddie is a senior and lifetime member of IEEE, a former chair of the IEEE Super Computer Committee, and a former president of The Siggraph Pioneers. In 2015 he was given the Lifetime Achievement award from the CAAD society.
 
Peddie lectures at numerous conferences and universities worldwide on topics pertaining to graphics technology and emerging trends in digital media technology. He has appeared on CNN, TechTV, and Future Talk TV, and is frequently quoted in trade and business publications.
 
Dr. Peddie has published hundreds of papers and has authored and contributed to no fewer than thirteen books in his career, the most recent being Augmented Reality, Where We All Will Live. He is a contributor to TechWatch, for which he writes a series of weekly articles on AR, VR, AI, GPUs, and computer gaming, and is a regular contributor to IEEE, Computer Graphics World, and several other leading publications.
 