IEEE Computer Society's Top 12 Technology Trends for 2020
Exclusive Content: Follow the links below to six peer-reviewed articles from Computer magazine's December 2019 issue.

Access the Web Chat on 2020 Tech Trends, in which the experts explore the 2020 technology predictions.
 

Here are the predictions for the Top 12 Tech Trends in 2020. The IEEE Computer Society has been predicting technology trends since 2015, and its annual forecast garners widespread attention for its authoritativeness. At the end of each year, the society also grades its predictions with a scorecard, or report card, which attracts audiences as large as the forecasts themselves.



Top 12 Technology Trends

The following trends are predicted by our experts to reach adoption in 2020.

 

  1. Artificial Intelligence (AI) at the edge (AI@Edge). The past decade has seen an explosion of machine learning (ML) in our daily interactions with the cloud. The availability of massive crowd-sourced labeled data, the increase in computer power efficiency at lower cost, and advances in ML algorithms laid the foundation for this disruption. As techniques improve and become robust enough to automate many activities, demand increases for using ML in new ways that are more pervasive than the initial cloud use cases. Combined with ubiquitous connectivity such as 5G and intelligent sensors such as the Internet of Things (IoT), ML applications will rapidly move to the “edge,” the physical world close to us all. In the coming years, we expect to see the widespread deployment of ML in areas that will have a far greater impact on our daily lives, such as assisted driving, industrial automation, surveillance, and natural language processing.

 

  2. Non-volatile memory (NVM) products, interfaces, and applications. NVM Express (NVMe) SSDs will replace SATA and SAS SSDs within the next few years, and NVMe over Fabrics (NVMe-oF) will be the dominant network storage protocol within five years. NVMe enables NAND tiering technologies and programming functions that increase endurance, enable computational storage, and allow more memory-like access to data. Emerging memory technologies such as MRAM, ReRAM, and PCM will enable future higher-performance NVMe devices.
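
The NAND tiering idea above can be sketched as a simple hot/cold placement policy. This is a toy model, not a real NVMe driver or any vendor's API; the tier names, the `TieredStore` class, and the promotion threshold are all illustrative assumptions.

```python
HOT_THRESHOLD = 3  # accesses before a block counts as "hot" (arbitrary choice)

class TieredStore:
    """Toy two-tier store: frequently read blocks migrate to a faster tier."""

    def __init__(self):
        self.fast_tier = {}    # stands in for a low-latency SLC-like region
        self.dense_tier = {}   # stands in for a high-capacity QLC-like region
        self.access_counts = {}

    def write(self, block_id, data):
        # New data lands in the dense tier by default.
        self.dense_tier[block_id] = data
        self.access_counts[block_id] = 0

    def read(self, block_id):
        self.access_counts[block_id] += 1
        if block_id in self.fast_tier:
            return self.fast_tier[block_id]
        data = self.dense_tier[block_id]
        # Promote blocks that have become hot.
        if self.access_counts[block_id] >= HOT_THRESHOLD:
            self.fast_tier[block_id] = self.dense_tier.pop(block_id)
        return data

store = TieredStore()
store.write("blk0", b"cold data")
store.write("blk1", b"hot data")
for _ in range(3):
    store.read("blk1")
print("blk1 in fast tier:", "blk1" in store.fast_tier)  # True
print("blk0 in fast tier:", "blk0" in store.fast_tier)  # False
```

Real devices make such decisions in firmware or in host software, but the trade-off is the same: spend scarce fast capacity only on data that earns it.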

 

  3. Digital twins, including cognitive twins. Digital twins are a reality in the manufacturing industry, and major IoT platforms such as Siemens MindSphere support them. They have also become a widespread tool in complex system operations; they have been used with city railways and power plants since January 1, 2019. Singapore's administration uses a digital twin of the city for planning, simulation, and operations. Cognitive digital twins are in the early stages of trial and experimentation.

 

  4. AI and critical systems. AI will increasingly be deployed in systems that affect public health, safety, and welfare. These systems will better utilize scarce resources, prevent disasters, and increase safety, reliability, comfort, and convenience. Despite the technological challenges and public fears, these systems will improve the quality of life for millions of people worldwide. Within five years, there will be a significant increase in the application of AI in critical infrastructure systems, or “critical systems,” that directly affect the public and whose failure could cause loss of life, serious injury, or significant loss of assets or privacy. Critical systems include power generation and distribution, telecommunications, road and rail transportation, healthcare, banking, and more.

 

  5. Practical delivery drones. Parcel delivery is an industry of enormous economic impact, yet it has evolved relatively slowly over the decades. It can still be frustratingly slow, wasteful, labor-intensive, and expensive. These inefficiencies, combined with recent developments in drone technology, leave the field ripe for disruption. Several companies have recently worked to develop practical delivery drones, which may now be ready to completely transform this industry, and consequently society as a whole.

 

  6. Additive manufacturing. 3D printing has existed since at least the early 1980s, but it has been confined largely to part prototyping and small-scale production of special-purpose or exotic pieces. Currently, new processes, materials, hardware, software, and workflows are bringing 3D printing into the realm of manufacturing, especially for mass customization. Unlike traditional manufacturing, additive manufacturing makes it economically viable to produce a high volume of parts where each one is different. For instance, companies such as SmileDirect now use 3D printers to generate tens of thousands of molds daily, each customized to make an orthodontic aligner for an individual person. Stronger and more robust materials, finer resolution, new finishing techniques, factory-level management software, and many other advances are increasing the adoption of 3D printing in industries such as healthcare, footwear, and automotive. In 2020, we expect to see this trend continue as other industries discover the benefits of mass customization and the opportunity to print parts that are not easy or affordable to produce using traditional means.
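
The economics of mass customization can be made concrete with a back-of-the-envelope comparison: traditional manufacturing pays a tooling cost per unique design, while additive manufacturing pays only a (higher) flat cost per part. All figures below are illustrative assumptions, not industry data.

```python
def traditional_cost(n_parts, n_unique_designs, tooling_per_design=5_000.0,
                     unit_cost=1.0):
    # Each unique design needs its own mold or tooling before the first part.
    return n_unique_designs * tooling_per_design + n_parts * unit_cost

def additive_cost(n_parts, unit_cost=8.0):
    # No tooling: every part can differ at the same per-part cost.
    return n_parts * unit_cost

# 10,000 identical parts: tooling amortizes, traditional wins easily.
print(traditional_cost(10_000, 1))       # 15000.0
print(additive_cost(10_000))             # 80000.0

# 10,000 parts, each unique (e.g. custom aligners): additive wins by far.
print(traditional_cost(10_000, 10_000))  # 50010000.0
print(additive_cost(10_000))             # 80000.0
```

The crossover point depends on the real tooling and unit costs, but the shape of the argument is the same: as the number of unique designs approaches the number of parts, tooling dominates and 3D printing becomes the viable option.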

 

  7. Cognitive skills for robots. Robots are increasingly moving from manufacturing floors into spaces occupied by humans. Robots in such environments need to be able to adapt to new tasks through capabilities such as increased comprehension of the environments in which they are situated. We predict that recent breakthroughs in large-scale simulations, deep reinforcement learning, and computer vision will collectively bring a basic level of cognitive ability to robots, leading to significant improvements in robotic applications over the next few years.

 

  8. AI/ML applied to cybersecurity. Cybersecurity is one of the key risks for any business today. The threat landscape is growing, spanning amateur attackers, sophisticated distributed denial-of-service attacks, and skilled nation-state actors. Defense depends on security analysts, but many of this rare breed lack adequate training, and the positions have high turnover rates. AI/ML can help detect threats and offer recommendations to security analysts, driving response times from hundreds of hours down to seconds and scaling analyst effectiveness from one or two incidents to thousands daily. It can also preserve corporate knowledge and use it to automate tasks and train new analysts. We predict advancing adoption of AI/ML applied to cybersecurity through a partnership among members of industry, academia, and government on a global scale.
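
The triage idea above can be illustrated with a minimal anomaly detector: surface only the hosts whose behavior is a statistical outlier, so an analyst reviews a handful of flags instead of raw logs. This is a toy sketch; the robust z-score here stands in for a real trained model, and the hostnames and counts are made up.

```python
import statistics

def flag_anomalies(counts, threshold=3.5):
    """Flag counts whose robust z-score (median/MAD based) exceeds threshold.

    Only unusually HIGH counts are flagged, since those are the suspicious ones.
    """
    values = list(counts.values())
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)  # median abs. deviation
    return sorted(host for host, v in counts.items()
                  if mad > 0 and 0.6745 * (v - med) / mad > threshold)

failed_logins = {
    "web-1": 4, "web-2": 6, "db-1": 5, "db-2": 3,
    "vpn-1": 250,  # looks like a brute-force attempt
}
print(flag_anomalies(failed_logins))  # ['vpn-1']
```

A median/MAD score is used instead of a plain mean/stdev z-score because a single extreme outlier inflates the standard deviation enough to hide itself, a failure mode that matters in security data.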

 

  9. Legal and policy implications of security and privacy. Data collection and leveraging capabilities are becoming more sophisticated and sensitive, often incorporating live feeds of information from sensors and various other technologies. These enhanced capabilities have yielded new streams of data and new types of content that raise policy and legal concerns over possible abuse: nefarious actors and governments can repurpose these capabilities for social control. New technology capabilities also strain the ability of average people to discern legitimate from fraudulent content, such as distinguishing an authentic video from a “deep fake.” The next year will therefore prove critical to maintaining the fragile balance between preserving the social benefits of technology on the one hand and preventing the undesirable repurposing of these capabilities for social control and liberty deprivation on the other. More aggressive legal and policy tools are needed for detecting fraud and preventing abuse of these enhanced technology capabilities.

 

  10. Adversarial Machine Learning (ML). ML generally assumes that the environment is not maliciously manipulated during the training and evaluation of models. In other words, most ML models have inadequately considered the ways in which an adversary can attack and manipulate the model’s functionality. Yet, security researchers have already demonstrated that adversarial, malicious inputs can trick machine learning models into undesired outcomes, even without full information about a target model’s parameters. As ML becomes incorporated into other systems, the frequency of malicious attacks on ML will rise. As such, security research into adversarial machine learning and countermeasures aimed at detecting manipulation of ML systems will become critically important. Similarly, recognition of ML systems’ fallibility and manipulability will begin to inform policymaking and legal paradigms.
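
The kind of attack described above can be shown on a toy scale: a small, targeted perturbation flips the decision of a classifier even though the input barely changes. The sketch below uses a hand-fixed linear model in the spirit of the fast gradient sign method; the weights and inputs are invented for the example, and real attacks target trained neural networks.

```python
import math

# A fixed linear model: score = w . x + b; classify "benign" if sigmoid > 0.5.
w = [2.0, -1.0, 0.5]
b = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(x):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def adversarial(x, epsilon):
    """Nudge each feature by epsilon against the gradient of the benign score.

    For a linear model the gradient w.r.t. the input is just w, so the
    worst-case direction is simply the sign of each weight.
    """
    return [xi - epsilon * math.copysign(1.0, wi) for xi, wi in zip(x, w)]

x = [0.4, 0.2, 0.3]
print(predict(x))                  # > 0.5: classified benign
x_adv = adversarial(x, epsilon=0.5)
print(predict(x_adv))              # < 0.5: the perturbation flips the label
```

The unsettling part, as the paragraph notes, is that comparable attacks work even without full knowledge of the target model's parameters, via transfer from a substitute model.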

 

  11. Reliability and safety challenges for intelligent systems. Intelligent systems, capable of making autonomous decisions, are attracting increased economic investment worldwide. We expect that they will be increasingly adopted in several fields, including smart cities, autonomous vehicles, and autonomous robots. Depending on the application field, the autonomy of intelligent systems has been formalized into defined levels. The higher the level of intelligence and consequent autonomous capability, the stronger the requirements on reliability and safety for the system’s operation in the field. Here, reliability is defined as the likelihood of correct operation for a given amount of time, while safety refers to the ability to avoid catastrophic consequences for the environment and users. Guaranteeing the high levels of reliability and safety mandated for highly autonomous intelligent systems will be one of the major technological challenges of 2020 on the way to a smarter world.
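
The definition of reliability above (likelihood of correct operation for a given amount of time) has a standard quantitative form: under a constant failure rate λ, R(t) = exp(−λt). The figures below are illustrative only, not requirements for any real system.

```python
import math

def reliability(t_hours, failure_rate_per_hour):
    """Probability of failure-free operation for t_hours (exponential model)."""
    return math.exp(-failure_rate_per_hour * t_hours)

# A component with a mean time between failures (MTBF) of 10,000 hours:
mtbf = 10_000.0
lam = 1.0 / mtbf  # constant failure rate, per hour

print(round(reliability(1_000, lam), 3))   # 0.905 over 1,000 hours
print(round(reliability(10_000, lam), 3))  # 0.368 at t = MTBF (i.e. 1/e)
```

Safety targets for highly autonomous systems push the required R(t) very close to 1 over long mission times, which is what makes the engineering challenge so hard.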

 

  12. Quantum Computing. The quest for practical quantum computing will move forward in 2020, yet remain incomplete. At the beginning of 2020, experimental quantum computer demonstrations consume about 1/10,000 the energy of the world’s largest supercomputers while outperforming them by 1,000x or more; yet the demonstrated applications look like quantum computer self-tests. If quantum computers are to succeed, they will do so by increasing relevance and generality, having already demonstrated a computational advantage. We project demonstrations to become more compelling in the next year. For example, a quantum computer might perform a chemical simulation that is impossible for any standard supercomputer, leading to a more nuanced debate about whether the chemical it discovers will be useful to society.

 


Six Peer-Reviewed Articles Published in Computer magazine

Click on each link to download an in-depth article on one of six of the new technologies predicted for 2020:

Grand Challenge: Applying Artificial Intelligence and Machine Learning to Cybersecurity

Security and Privacy in the Age of Big Data and Machine Learning

Digital Twins: Bridging Physical Space and Cyberspace

Practical Drone Delivery

Nonvolatile Memory Express: The Link That Binds Them

Cognitive Robotics: Making Robots Sense, Understand, and Interact


Our Flagship Computer Magazine

Our Top 12 Tech Trends are highlighted in a special issue of our membership magazine, Computer.

Access Computer magazine’s special December 2019 issue here for more on the 2020 Tech Trends.

17 December 2019 Webinar

The authors of the Top 12 Tech Trends of 2020 report will discuss their work in a webinar on 17 December 2019.

Sign up here for the free Web Chat on December 17, when they will discuss the 2020 technology predictions.

The Annual Scorecard on the 2020 Tech Trends Predictions

At the end of 2020, the Computer Society tech experts will review these predictions to determine how closely they match up to technology’s reality. Check back in December 2020 as IEEE CS grades its latest predictions.

For past technology forecasts, visit the 2019 technology predictions and view the 2019 prediction scorecard for evaluations and grades of our predictions.

 

Become a Member and Save on Publications

  • Learn what’s new and what’s next with access to Computer magazine.
  • Access a wealth of computing information at your fingertips with the Computer Society Digital Library.