Issue No. 05 - Sept.-Oct. (2012 vol. 10)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MSP.2012.130
Gary McGraw , Cigital
Kay Connelly, at the time of this interview an associate professor of computer science at Indiana University and senior associate director of IU's Center for Applied Cybersecurity Research, describes why in situ usability study is important, the ETHOS living lab, and her advice to women interested in pursuing a career in computer science.
You won a best paper award at Ubicomp in 2007 for your work on usability. Why is watching how people actually use technology outside the lab important?
When you're talking about pervasive or ubiquitous computing, you're not talking about going to a computer and doing a specific, self-contained task. That's where usability labs are quite useful—looking to make sure a particular interface is constructed in such a way that people don't make too many errors. But when you take technology out into the real world and people interact with it without even realizing it, you really have to know how it's going to be used.
Does that cut against the grain of what most people think?
No, the community as a whole is moving in that direction. But it's difficult to decide when to use the lab and when to do in situ evaluation. Quite frankly, it's very time-consuming and expensive to deploy these things in the real world for any period of time, so when you can get away with the usability lab, you certainly want to.
Sometimes there's a disconnect when experts think about technology's use in the real world versus when real people think about it. Do you think that this disconnect is exhibited in privacy work as well?
Absolutely. A lot of the privacy literature defines the term very academically—privacy as property, privacy as autonomy—and that's not at all how regular people define it. Instead, they talk about privacy in terms of what they're trying to do, as a side effect that they don't necessarily think specifically about. I've worked with the elderly, trying to understand their concerns about putting technologies in their homes meant to help them live independently for longer. They talk about Big Brother and about not wanting people to always have a window into their homes, but a lot of it boils down to this: Is a particular system that we're presenting something they find useful? If it is, then they might accept the privacy invasion more readily.
You've won some important awards for your teaching, including the prestigious Trustees Teaching Award. How do you teach about privacy?
I haven't taught a privacy course, but I certainly embed privacy concepts in my graduate-level pervasive computing course. The way that I teach it in general is that my students have to design and build a system with real users, and privacy comes up in that context. Students have different perspectives, so there's always a lot of discussion. Computing, in general, is becoming much more intertwined in other areas. Psychology, social science, healthcare—we have to know a little bit about all of these things because computing affects all aspects of our lives now.
Does your teaching inform your research more than your research informs your teaching?
They go hand in hand, especially for my graduate-level courses. I always bring in a current research project and make it a semester-long project. I'll also have the class work on team-based projects, with maybe two of those going on to become full-fledged research projects.
I listened to your piece on NPR about cell phones and marveled at the naiveté of some of the other researchers in the story who actually want a device to know everything about them. It seems to me that this is an unbelievable invasion of privacy. Do you think most people are blind to that, or just researchers?
I think both. You certainly have individuals who are very privacy sensitive—they won't even use store loyalty cards because they don't want corporations knowing what they purchase. But those people are few and far between. If there's some sort of a utility, people are usually willing to give up their privacy because they don't have a good concept of how data can be aggregated and used against them. That's one of the things we're doing with the elderly people I mentioned earlier because we found that they didn't understand some of the threat models. We created video skits that make these kinds of threats clear—for example, Internet shopping that introduces price discrimination based on your past history. This is something they didn't realize, but once we made it clear, they became very concerned.
My college roommate was aware of the data aggregation risk, but he thought it would give him access to better junk mail, so he was cool with the whole thing.
But in some ways that's true, right? When you're on Amazon and it suggests books that you might like, it gets it right a lot of times. Your reaction is, "Wow, that's interesting."
Everybody's heard of HIPAA, although few can describe in concrete terms what it might have done for them besides extra paperwork at the doctor's office. Is government regulation of privacy a good or bad idea?
I think that it can be a good idea if you look at how it's done in Europe, for example, where you own your data. People can't collect your data for one reason and then go use it for another reason without your consent. Although there's a question about whether businesses can be productive under that model, it's better than what happens over here.
I think the problem with HIPAA is that it's a lot of regulation that wasn't necessarily well thought out, and in practice you give up control of your data if you want your insurance reimbursed. You couldn't go into my doctor's office and try to get that information without resorting to some sort of a trick, but realistically, I have no idea who has access to my information, because who all needs it in order for me to get my insurance reimbursements?
What do you think is the best way to impose privacy progress on recalcitrant corporations?
We have to have laws saying that they can't use the data for anything that they didn't get permission for, and then those laws have to be enforced. Right now, it's whatever they put in their privacy statement, which can change at any time without notification, by the way, so it means absolutely nothing.
I've had the opportunity to wander around your ETHOS laboratory. Can you explain two of your favorite projects from the lab, so that people get a feeling for the kind of work you're doing?
Some of them I can't talk about because they're under patent procedures right now, but one of my favorites is the presence clock. The idea is that you have two clocks paired over the Internet—one is in the home of an older person, and the other is in someone else's home, most likely an adult child. So, for example, my mom would have hers next to her chair in her sitting room, and I would have mine in my kitchen because we both spend a lot of time in those spaces. The clocks have motion sensors that light up depending on the user's presence, so when I get up in the morning, I can look and see what time my mom started getting up because maybe at 7 a.m., the light on my clock is of moderate intensity. I can kind of get a rhythm of her day that indicates, yes, she's up and she's having her coffee. It just allows me to know that she's up and about without me having to call her every morning to make sure that she's okay.
So the nonintrusive nature is what you're looking for?
Very nonintrusive. If I see that there's been no activity in the past 12 hours, and I didn't know my mom was going somewhere, I might pick up the phone and give her a call. A lot of older adults become socially isolated, so this technology works both ways—she can see when I'm in my kitchen fixing lunches for the kids and things like that. It gives both of us a sense of security, a feeling that someone is on the other side of "the line." It's an interesting project because when we first introduced it to older adults, most of them didn't get it and said, "I'd rather just pick up the phone." But after we actually deployed it, they didn't want to give it back. They just didn't realize how much of a sense of security and connection to their child it would give them.
Did you let them keep it?
We didn't because it wasn't at the stage where it could work long term. We would love to, so we're working on prototypes that can be deployed next year that are much more robust. It will be available in the near future.
Do you have another project you can explain?
Sure. The portal monitor was actually designed by my colleague Jean Camp. The idea behind it is that whenever someone either rings the doorbell or opens the front door, a series of quick snapshots is taken and sent via a multimedia text to an adult child or a caregiver. If you have an older adult who's alone, and you're worried about people preying on him or her in some way, you can get a text and go through the three pictures and see who's there. You can immediately see, "Oh, that's a neighbor," or "Wow, this is someone I don't recognize. Maybe I should give my mom a call and make sure that she's okay."
Is that sort of preying on the elderly becoming more common?
I don't know that it's becoming more common, but it has always existed. You always hear of the contractor who convinces the older adult he needs a new roof when really he doesn't. A lot of financial fraud happens in person, but a recent case of a woman who was kidnapped made national headlines. The portal monitor could have alerted authorities a lot sooner about the possible abduction and what the abductor looked like.
You're committed to forwarding women in science and in computing as well. Why are there so few women in computer science?
That's a question that a lot of people have studied, and we still don't have a definitive answer. But certainly we know that there's an image problem, especially when you talk to young children about what a computer scientist looks like—the responses are stereotypical, that the person is geeky, nerdy, wears glasses, and works in front of a computer all day. So we lose a lot of people—male and female—who don't want to be the nerd and don't see themselves as that kind of a person. In a lot of ways, we aren't doing a good job of explaining everything you do as a computing professional. I program, and I have programming skills, but a lot of what I do is interact with people to figure out what kind of technology would make sense in their everyday lives. I have a lot of social skills as well.
What advice would you give to a young woman who's interested in pursuing computer science as a career?
It depends on her interests, so I don't have a speech on "here's how to make it as a woman," because I don't know. I just don't.
You have a data sample of one.
No, about half of the graduate students in my lab are women, which is a very high number for this area. But my work tends to attract a lot of women because it's in the health arena. Research has shown that women tend to pick professions where they can make a difference in people's lives, and this area is very hands-on. You're actually working with, say, dialysis patients or the elderly, and you immediately see the impact that you're making. Computing and what is now being called informatics is wide open in terms of what you can combine with computing skills to do different things. It's good to get that foundational knowledge so that you can be a bridge between people who know only, say, biology and people who know only computer science. You can be that bridge, and those bridge people are very valuable.
The book I'm currently reading is called The Song Is You by the guy who wrote Prague. I know you share my love of fiction, so what are you currently reading?
I'm actually reading essays in the form of I Was Told There'd Be Cake. It's something that I can read in small chunks. Having small children at home over the summer, I can't get involved in something very long right now.
The Silver Bullet Podcast with Gary McGraw is cosponsored by Cigital and this magazine and is syndicated by SearchSecurity.
Gary McGraw is Cigital's chief technology officer. He's the author of Software Security: Building Security In (Addison-Wesley, 2006) and eight other books. McGraw has a BA in philosophy from the University of Virginia and a dual PhD in computer science and cognitive science from Indiana University. Contact him at firstname.lastname@example.org.