Meet Ian Foster of Argonne National Laboratory

How the Creator of “Grid Computing” Is Revolutionizing AI and Big Data Research
Lori Cameron
Published 05/23/2019

Ian Foster, director of the Data Science and Learning Division at Argonne National Laboratory and winner of the 2019 IEEE Computer Society Charles Babbage Award, is a pioneer in the field of distributed computing. Together with his colleagues, he coined the term “grid computing,” the precursor to today’s field of cloud computing.

His open-source Globus Toolkit has been used for numerous groundbreaking research projects, including a collaborative project that harnessed the power of multiple supercomputers to simulate the gravitational effects of black hole collisions. The research team from Argonne National Laboratory, the University of Chicago, Northern Illinois University, and the Max Planck Institute for Gravitational Physics in Germany won the prestigious Gordon Bell Prize for its work.

We caught up with Foster to ask him about his accomplishments and future plans.

Computer Society: What was your reaction to having been chosen to receive the IEEE CS Charles Babbage Award?

Foster: I was of course delighted. Babbage has long been my hero, and I love the fact that there is an award in parallel computation named after him. Babbage was a splendid mix of the visionary and the practical, two things that I aspire to be.

Computer Society: Your career has involved a healthy cross-section of academic and industry pursuits. How do you do it?

Foster: I like to build things, and to sustain the things that we build when people find them useful. That often requires creative approaches to assemble the required resources and, just as importantly, to provide a home for the talented people needed to build and sustain them. Sometimes a national laboratory is the right way to do this, sometimes a not-for-profit embedded within a university (Globus at the University of Chicago), and sometimes a company (Univa, Praedictus Climate Solutions).

Ian Foster (right) of Argonne National Laboratory and the University of Chicago receives the Computer Society Charles Babbage Award for his “outstanding contributions in the areas of parallel computing languages, algorithms, and technologies for scalable distributed applications” at the IEEE International Parallel and Distributed Processing Symposium (IPDPS) in May 2019. He accepts the honors from IPDPS steering committee chair Viktor K. Prasanna of the University of Southern California.

Computer Society: What’s the best career advice you can give, for any level—entry, mid-career, managerial?


Foster: Avoid mainstream thinking. Solve real problems. Work with smart people. Read. Write. I’m not sure whether that is the best advice or even good advice, but I try to follow it myself.

Computer Society: What are you most looking forward to at the IEEE International Parallel and Distributed Processing Symposium to be held in Rio de Janeiro?

Foster: I’m delighted to be in Brazil, for one thing: apart from one brief trip to Brasilia, I have never been there. The conference has lots of good talks. I’m particularly looking forward to the Friday workshop on Parallel AI and Systems for the Edge.


Computer Society: What’s next for Ian Foster?

Foster: I have a fairly new role at Argonne as director of the Data Science and Learning Division. We are at an exciting time for science, as artificial intelligence methods (that is, methods that allow computers to learn from data, rather than being programmed explicitly) become relevant to a growing number of research tasks. For computer scientists, these developments mean a lot of new challenges. Where do we get the data required to train science AI? How do we allow science AI to leverage the vast theoretical knowledge accumulated over hundreds of years? How do we organize science AI to permit productive human-AI partnerships? What computer hardware and software will be needed to harness, apply, and manage AI? I hope to help answer some of these questions.