Issue No. 1, January/February 2011 (vol. 9)
Published by the IEEE Computer Society
Gary McGraw, Cigital
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MSP.2011.18
Gary McGraw interviews Paul Kocher, president and chief scientist of Cryptography Research. Among his many accomplishments, he helped design SSL 3.0, Electronic Frontier Foundation's DES cracker (nicknamed "Deep Crack"), and parts of the content security system found on Blu-ray discs. He continues to design crypto protocols and applications, which are then successfully deployed in real-world systems. Hear the full podcast at www.computer.org/security/podcasts/ or www.cigital.com/silverbullet. The Web extra is a full transcript of the interview.
Gary McGraw: Security analysis benefits from both a big-picture, system-level view of a target and an understanding of how various levels and layers in a system interact. Is this analysis capability something that you think can be taught?
Paul Kocher: It certainly can be learned. I think with any kind of an art, whether it be cryptography or painting, you have to like what you're doing, and you can't learn it by just being forced to sit through half a dozen classes and be done. It's gotta be something that you live and breathe, and when you try to fall asleep at night, it's going through your head, sort of percolating your membranes as you try to figure out the issues. Anybody who's got that enthusiasm, I think, can learn this, assuming you have some math aptitude.
You mentioned something interesting about layers, though, and it's probably worth talking about a little more, in that—at least in my view—the way that layers in a system and layers of abstraction work with security applications is completely different from traditional applications, because layers hide problems and in many ways are the thing that we're trying to cope with. In other areas, if you can abstract a problem, then great—you don't have to worry about what sits underneath. With security, you have to worry about every single thing underneath you in the stack, all the way down to the transistors, and if you forget something or screw something up, you're likely to have a catastrophic problem.
McGraw: It's a good stomping ground for potential attack, to think about those layers and think about the assumptions in particular that a higher layer may make, which you can make go away as an attacker.
Kocher: Absolutely, and if you want to find vulnerabilities, you almost never find them sitting at a single layer or a single piece designed by one person. It's when things interact across those layers that you get all the really interesting results.
McGraw: My opinion has changed over the years regarding whether engineers and architects of systems need to be taught to think like a bad guy. What's your opinion?
Kocher: I think what you need to do to be a good architect involves thinking differently from a bad guy because the bad guy's a hero if he finds one way in. If you have a 1 percent success rate as an attacker, you're tremendously successful. But if you're a defender, and you have a 99.9 percent success rate at building components that work together, and you've got thousands of components in your system, you're a complete failure because your systems are going to have vulnerabilities. If 1 percent of the attacks succeed against your system, you're a complete failure. You have to be able to think much more comprehensively and much more completely as a defender. That said, there are certain things, like understanding how attackers will find one little chink in the armor and then use that to find another and build a series of different steps to get through a defense that otherwise looks pretty strong, that you've really got to understand. Is that your opinion?
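Kocher's asymmetry can be made concrete with a quick back-of-the-envelope calculation using the figures from his example (the component count and per-try probabilities here are illustrative assumptions, and the components are assumed independent):

```python
# Back-of-the-envelope illustration of the defender/attacker asymmetry
# Kocher describes. Assumes independent components/attempts, which is a
# simplification, but the orders of magnitude are the point.

def prob_system_flawless(p_component: float, n_components: int) -> float:
    """Probability that every one of n independent components is secure."""
    return p_component ** n_components

# A defender who gets each component right 99.9% of the time,
# across a system of 1,000 components:
print(f"{prob_system_flawless(0.999, 1000):.3f}")  # ~0.368: probably at least one flaw

# An attacker who succeeds on only 1% of independent attempts, over 100 tries:
p_attacker_wins = 1 - (1 - 0.01) ** 100
print(f"{p_attacker_wins:.3f}")                    # ~0.634: probably at least one break-in
```

Even a 99.9 percent per-component success rate leaves the defender more likely than not to ship a flaw, while a 1 percent per-attempt attacker only needs persistence.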
McGraw: Pretty much. I used to believe that all system designers, developers, and architects needed to know something about attacks—in fact, I thought they needed to know a lot about attacks. But I think, in some sense, the skill of building things defensively is equally important, and if you teach that skill properly, it may not require getting into the particulars of certain attacks. Does that make sense?
Kocher: Definitely. If you use AES as a cipher, it means you don't need to know every single form of cryptanalysis, but you sure better understand the ways that different modes of operation work. There are certain things you can exclude and other things you have to worry about.
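Kocher's point about modes of operation can be sketched in a few lines. The example below uses a toy keyed permutation built from SHA-256 as a stand-in for a real block cipher (it is not AES and not secure; the names and key are invented for illustration), purely to show why the mode matters even when the underlying primitive is sound: ECB leaks which plaintext blocks are equal, while a counter-style mode does not.

```python
import hashlib

BLOCK = 16  # block size in bytes

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    """Toy stand-in for a block cipher; NOT secure, for illustration only."""
    return hashlib.sha256(key + block).digest()[:BLOCK]

def ecb_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """ECB: each block encrypted independently, so equal blocks leak equality."""
    return b"".join(toy_block_encrypt(key, plaintext[i:i + BLOCK])
                    for i in range(0, len(plaintext), BLOCK))

def ctr_encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    """CTR-style mode: XOR with a per-block keystream, so equal blocks differ."""
    out = bytearray()
    for i in range(0, len(plaintext), BLOCK):
        counter = (i // BLOCK).to_bytes(8, "big")
        keystream = toy_block_encrypt(key, nonce + counter)
        out += bytes(a ^ b for a, b in zip(plaintext[i:i + BLOCK], keystream))
    return bytes(out)

key = b"0123456789abcdef"
msg = b"ATTACK AT DAWN!!" * 2          # two identical 16-byte blocks

ecb = ecb_encrypt(key, msg)
ctr = ctr_encrypt(key, b"unique-nonce", msg)

print(ecb[:BLOCK] == ecb[BLOCK:])      # True: ECB reveals the repetition
print(ctr[:BLOCK] == ctr[BLOCK:])      # False: CTR hides it
```

The same "secure" primitive, used in the wrong mode, silently leaks structure, which is exactly the class of thing Kocher says a defender can't abstract away.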
McGraw: You designed parts of SSL version 3 as a consulting project for Netscape; at that time, processor speed was a serious constraint. But now that it isn't, would you do anything differently? Or was the speedup dictated by Moore's law factored into the design?
Kocher: Performance in crypto is very, very rarely a problem. I mean, there are a few massive websites that have issues, but even when we were doing the design [back] then, absent denial-of-service issues against the Web server, there really weren't too many concerns. Your Web browser in those days was usually connecting through a 56-Kbit modem, but your PC's CPU was more than fast enough to do crypto, so it really wasn't a significant overhead.
In many ways, a lot of attention gets paid to performance in security systems, but really what users want at the end of the day is to know that their computer isn't going to get screwed up. So if you save them a millisecond here and a millisecond there, but they end up spending an entire day rebuilding their hard drive, you haven't served anybody's objectives.
In doing that design, there were some performance things that I was paying attention to. There's a session resumption capability in that protocol that can shave off a round trip, but aside from that, performance really wasn't a major concern. I don't think it is today, as long as you're using normal algorithms and using them properly.
McGraw: You also designed the DES cracker. What's easier or more fun: breaking a system or designing a system that's hard to break?
Kocher: They're both fun. The DES cracker project was actually as much political as technical to show that 56-bit keys were inadequate and the export controls that were being imposed were really just causing harm for the good guys and not stopping bad guys who had any kind of a budget.
What gets me out of bed in the morning is building systems that you're going to see in the real world, that you're going to use, that are going to stop bad guys from doing things that they want to do. A protocol like SSL, with all of its warts—and I'll be the first to admit that it has plenty of them—is really neat because it's something that people use on a daily basis. It enables people to do transactions online that you wouldn't otherwise be able to have trust in. It's a building block used in a lot of places, and I really appreciate the opportunity to work on systems like that.
McGraw: Getting back to the attacker perspective, I'm wondering whether you think there are complementary skills here between people who break systems and people who design them, and whether it's rare that they're present in an individual. It certainly is the case that many individuals can do this, but is it a rare thing? Or is it a common thing?
Kocher: I don't know. In order to do design, you have to be able to deal with things like politics and businesses and business constraints and contracts and managing engineering teams, but breaking things, in most cases, [involves a] guy in his basement; it's a solitary, individual type of a pursuit. So there are big differences between the two.
McGraw: You sold part of your company to Macrovision for $60 million, which included the Blu-ray security design. What caused you to hit on the notion of putting protection code on the content disc itself instead of in the player?
Kocher: I looked at the security of the DVD format when it first came out and the really obvious things everybody knew about: bad algorithms, bad key sizes, and so forth. But one of the things that was immediately obvious was that the wrong people were writing the security software. You had player makers who have no interest in solving Hollywood's problems responsible for securing [content]? Why should I spend money to protect your house from being broken into? I might, if I'm a nice guy, do a little token effort, but I spend a lot more work on protecting my house than I do protecting my neighbor's house. And with content protection, in many ways it's even worse than the device makers not caring. In some cases, they actually have consumers who want to pirate stuff, so they have an incentive in that case to cater to that demand and intentionally make their security bad. So we built a system where we put security software on each disc—that way, it could be written by the studio, and it could change as attacks change, because inevitably they do, and simplify the problem for player makers because they were no longer the sole entity responsible for security.
McGraw: I have to tell you, I used to curse you with some regularity, Paul, because I had an early Blu-ray player that wasn't connected to the Internet, and so I had to re-burn its BIOS pretty consistently whenever a new design came out.
Kocher: Actually, that had nothing to do with the security—it had to do with the interactive component of Blu-ray, because there's a whole Java layer in there, which is orders of magnitude more complicated.
McGraw: Okay, I take back all that cursing then.
Kocher: No, just understand you should curse somebody else, not me.
McGraw: That was a pretty good idea, though, I think, and it shows kind of the way that you blend a technical approach that's very savvy and correct with this notion of a business model. That's an incredibly rare skill. In addition to being a world-class engineer and designer, you're an exceptional entrepreneur, so tell us a bit about Cryptography Research's organic growth. I find it astounding that your annual revenue per employee target is $4 million.
Kocher: Well, it's gone down as we've grown. A lot of what we've been able to do is, in many ways, just luck. I went to college just as the dot-com boom was forming and graduated just as the curve was about to go ballistic. If you look at the history of cryptography, there's been probably no more exciting period than the one that I've been fortunate enough to work in. Combine that with having the good luck to find research results that could ultimately generate money, and understanding the patent system well enough to get paid for that work.
I was supposed to become a veterinarian. That was my plan when I went to college, so in any other decade, I would be a veterinarian today. There's a degree of being in the right place at the right time. There's also an element of looking at research and being able to find things that meet a whole bunch of different constraints at once. It needs to be an interesting problem to keep myself and the others here engaged with it. It needs to be something with a technical element. It needs a business case that works. It can't have political problems.
Security's a difficult place to make money because you've got a lot of engineers who don't understand what they don't know, and that sounds like a double negative, but it's actually not, really. If you don't understand how things can go wrong, you don't value or understand what somebody's going to do to try to fix them. You've got entrenched interests that often don't want to use somebody else's security technology. Everybody wants to manage the keys themselves, yet almost nobody's actually competent to do that, and you have the same thing with design. It's a tough area to work. But when you do find something that's important, people do deploy it and really do want security. It's just a question of building technology that works properly.
McGraw: Tell us a little bit about the philosophy that drives Cryptography Research, how you've grown over the years, and how you attract good people.
Kocher: Well, hiring is by far our biggest problem, and there are three main things that I look for in people that we hire. One, is this person technically—at least, for a technologist—brilliant? It's quite hard to tease that out in an interview. Maybe you know the work that somebody has done.
[Next, the person] has to be fun to work with. Life's too short to spend time working with people who aren't fun. And [finally], they have to communicate well because security, to a large degree—especially on the design side—is about communication. If you are the mad genius who comes up with something nobody can understand, it's completely useless on the design side because you have to show your customer why it's secure; you have to show their customers why it's secure. You have to deal with multiple companies if it's going to be a standard. You have to make sure that you've made correct assumptions about lower layers of abstraction and the people who are using what you build understand what you've produced, or they'll have security failures.
McGraw: The real challenge is finding an individual who has all three of those skills.
Kocher: Absolutely. If you only needed two, it would be much easier to find people.
McGraw: With regard to trying to grow, you've always been fairly conservative, and it's paid off very handsomely. Do you think that this is just timing, like you were talking about before? Or do you think that this is a business model that, in contrast with, say, venture-capital-driven companies, is something that's really worth thinking more carefully about?
Kocher: By growing slowly, there are a lot of projects we've not pursued, and probably some of them would have paid off if we had pursued them. But having been through the most recent recession and the dot-com crash, and navigated a company through them, I can say that having grown to be exactly the optimal size for your optimistic market projections is a pretty scary thing. Whereas if you've got a company that's turning down projects—you know you're pursuing only the third or quarter of the opportunities that come along that look the most attractive—then if a recession comes along, maybe that ratio goes up a bit. But it's a much more stable and much less stressful way to manage a group.
McGraw: More head room, plus, as you said before, interesting work is really the key. It's not about piling up the cash.
Kocher: Absolutely. I mean I'm not somebody who wakes up in the morning and says, "How can I make enough money to buy X?" I reached the point where I can eat out every night, and there's not really a lot more I can eat, and there's not that much that I really want to buy that would bring me happiness. So it's really more a question of, "Do I have the problems I want? Am I working with people that I enjoy working with? Am I making a difference in terms of just stopping bad guys, helping people protect their data?" Those are the things that get me out of bed in the morning and make me love going into work and working on these kinds of systems.
McGraw: So now a really easy question. Does P equal NP?
Kocher: If N equals one, it does.
McGraw: That's a cheating answer. You know that N doesn't equal one.
Kocher: I firmly, intuitively, have a conviction that the answer is no. I would be delighted to be shown wrong, but of the things that keep me awake at night, fundamental mathematical breakthroughs are way at the bottom of the list, along with quantum computing. Software bugs, implementation defects, operator errors, bad user interfaces, misuse of algorithms, malicious guys in manufacturing facilities—these are the things that you lie awake worrying about. If you quantify the risks, the likelihood that some incredible mathematical breakthrough is going to cause all of our ciphers to collapse is really infinitesimally small. The existence of implementation defects in a broad range of products, on the other hand, is a simple fact that we know is true, so it's just a question of trying to find them and cope with them and deal with them.
McGraw: I couldn't agree more.
Kocher: It's a good question. But if anybody's lying awake at night worried that somebody's going to break AES-256, and not worried about the quality of their hardware and software implementations or side-channel attacks and so forth, they've completely missed the boat.
McGraw: That's true, but you did wax philosophical. I was just hoping for a yes or a no.
Kocher: When you interview cryptographers, you should know better than to expect a yes or no answer.
Show links, notes, and an online discussion can be found on the Silver Bullet webpage at www.cigital.com/silverbullet. See the full text of this interview at www.computer.org/cms/Computer.org/dl/mags/sp/2011/01/extras/msp2011010008s.pdf.
Gary McGraw is Cigital's chief technology officer. He's the author of Exploiting Online Games (Addison-Wesley, 2007), Software Security: Building Security In (Addison-Wesley, 2006), and seven other books. McGraw has a BA in philosophy from the University of Virginia and a dual PhD in computer science and cognitive science from Indiana University. Contact him at firstname.lastname@example.org.