Eugene: We have a mindset among too many that this is a technology problem that needs to be solved with technology, and it's not. Technology's important, but it's the people who build the technology, who use the technology, and who respond to the technology that are really involved here.
Caroline: From Cobalt at Home, this is "Humans of InfoSec," a show about real people, their work, and its impact on the information security industry. I am so pleased to be joined today by my friend and colleague, Eugene H. Spafford, known to some as Spaf. Spaf is a professor of computer sciences at Purdue University. He is the founder and executive director emeritus of the Center for Education and Research in Information Assurance and Security, also known as CERIAS, like we're serious about security. He's been working in computing as a student, researcher, consultant, and professor for 44 years.
A lot of his work is at the very foundation of today's security practice, including intrusion detection, firewalls, and whitelisting. His most recent work has been in cybersecurity policy, forensics, and future threats. There is so much that I can say about Spaf's different titles, about his different awards. I am not gonna do that here because I want to use most of our airtime so that you can hear from Spaf himself, but look it up. It's very impressive. Having Gene as a guest is such a humbling experience. I have been so looking forward to recording this episode. Thank you, Spaf, for being here. It is such an honor to be with you today.
Eugene: Oh, I'm just delighted to be able to talk with you again.
Caroline: I was just telling Spaf that my in-laws are actually based really close to Purdue University, so I am hoping to schedule a trip to come out and visit in person sometime in the future.
Eugene: Well, we would love to have you visit. We enjoyed you speaking in our seminar series not too long ago. And next year is the 25th anniversary of the founding of CERIAS, and so we would encourage people to come visit us.
Caroline: That is really cool. A huge congratulations. I'm so glad to learn about that. Spaf, you obviously have so much knowledge and so much firsthand experience in cybersecurity. It's actually a perfectly fair and accurate statement to say that you have shaped many of the defenses that we use today. You've dedicated decades of your life to this field. What about it inspires you to continue contributing to this community?
Eugene: Well, I guess kind of a flip answer to the way that you phrased the question is that if I helped shape where we are today, I'm trying to atone for that. We continue to have a lot of challenges, and the reason that I continue to work in this area is partly that, as an educator, I see the value in getting more people involved in the field, and that continues to energize me and excite me. It's also the case that we have a mindset among too many that this is a technology problem that needs to be solved with technology. And it's not. Technology's important, but it's the people who build the technology, who use the technology, and who respond to the technology that are really involved here. And I don't see that as accepted as it should be. That is one of the things that encourages me to continue on, to try to help shape that conversation and that direction.
Caroline: That's very cool. You know, I myself am kind of constantly thinking about different comparisons for the cybersecurity industry, and one that I like recently is the comparison of cybersecurity to a dance, because there are actually all these different people involved. Let's keep going. Gene, many of the speakers on this show have actually shared kind of a common theme, which is that so many of the threats we see today are very similar to what we saw in the 1990s; they just arrive through different attack vectors. You happen to be a co-author of one of the first English-language technical books on computer viruses and malware, published in 1989. What do you think? Is it all kind of just the same, or how have things changed?
Eugene: I think if you look at it from an abstract-enough perspective, many of the threats are still the same. The landscape has changed. So 40 years ago, for instance, we were much more concerned with the security of mainframes and group minicomputers. Some of the threats we have are a little different, and the machines back then were generally administered by better-trained professionals, compared with the computing we now have on a laptop or a handheld smartphone, used by everybody who is facing these threats. So that's one difference.
A second difference is the immediacy of communication and the global scale of communication that didn't exist back then, which has altered some of the kinds of threats we are facing. And related to that is the underlying social and governmental perspective, in that the motivations for most of what we were facing, again, let's just say 40 or 50 years ago, were funded intelligence agencies in some cases, but more likely hobbyists or vandals. The nature now of organized crime, for example, and much greater nation-state involvement, not simply in intelligence collection, but in influence operations, changing public discourse, and damaging systems, creates a different kind of threat environment. Many of the individual threats are similar, but the way they're wielded has changed.
Caroline: That is so interesting. And I'm really glad to learn about how you think about then versus now. You mentioned several different ways in which our technology and our lifestyles, and even some of the actors have changed over time. Do you feel like we've gotten any better at keeping data and people safe?
Eugene: It's a mixed set of results. We have some better tools available, and we have a better understanding of how to protect some systems. What's hindered that a bit is commercial pressure, both to move new features out into everyone's hands and to collect more information to be used in marketing, for instance. The result is we know better how to protect information, but we don't use some of the technologies that we know how to build, either because they're expensive to build and they get in the way of pushing out new services and technology, or they get in the way of advertising. And now, as we see growing interest in machine learning and AI that can consume huge amounts of data, there's an even greater push to collect the very information that we would otherwise want to protect. So the trend, I think, has been that we've done better with understanding the technologies, but poorer at understanding where best to apply them.
Caroline: What a balanced and insightful perspective. Spaf, is there anything that we do in the security field now that might have felt like it was science fiction when you started your career in this field?
Eugene: Well, because I've been doing it for a long time, I would say most of what we do was science fiction at the time. The very notion of something we accept as a matter of course now, a smartphone that we carry around that is a camera, address book, calculator, video device, email client, and more, was really science fiction back in the day of the systems that I first learned to use. We can take that further with massive server farms and the cloud, although my PhD thesis was on a system named Clouds back in the '80s, which I like to point out to people: the cloud is not something new. Even quantum computing was a completely imaginary science fiction concept back then. Always-connected, always-on wireless connectivity, and an internet that's not only global terrestrially but reaches to the space station and is even in use by Mars landers, is mind-boggling even now and was totally in the realm of science fiction back then.
But I think one of the things that's important about this, and I use this sometimes in talks, is that those technologies now are kind of a "so what?" Okay. But what will we have 40 or 50 years from now? The people who are working in the field now, who will be working then, what is it that they're doing to anticipate the changes in technology and capabilities that another few decades will bring? Instead of designing simply for two years from now, we really need to be giving some thought to those longer-term, science fiction, if you will, developments that are undoubtedly coming. But we're so busy trying to keep up with the pace of day-to-day that we aren't stepping back and really trying to consider how we're going to shape that future.
When I talk about this to audiences, I point out that if we look at science fiction, there are sort of two directions we can go: one direction is the "Star Trek" future, and the other is the "Blade Runner" future. And the choices we're making now are the ones that are going to continue to guide us in one direction or the other. Not that those are the only directions, but we all have an ability to influence how the technology is going to be developed and used. So the question is, are we giving sufficient thought to how those changes are going to affect the field?
Caroline: Gosh, I am the mother of two fairly young children, and the technology that's normal in their world would certainly have been science fiction when I was a kid. And it is so interesting to think about what's coming up in the future. One of the things that I'm really hearing in your description and your reflections, I actually interpret as an acknowledgment of both accountability and responsibility, but also hope and optimism. I happen to have a viewpoint, and I think I hear it shared by you as well, that we have kind of a massive responsibility to determine what happens in the future, and it really is up to us.
Eugene: It really is. And I'll give two quick examples. One is in the area of robotics. We can certainly envision, and some of us have worked toward, the idea of something like an android, like Mr. Data from "Star Trek," that can help us in environments where having that mechanical assistance could be helpful. But there's also the case that there are those who are trying to build robots to be used by the military, and there's concern over autonomous robots that carry weapons. Are those going to be appropriately controlled, and how are they being used? Those are choices that need to be made by people, not by the technology.
Another one is in the area of AI and machine learning. We can think about mining that information to, for instance, develop better information on healthcare and health trends, or social policy. But if it's only going to be used by candidates to disrupt our lives in politics, or to market us particular products, that isn't realizing the potential that it has. Those all come down to choices made by people. And as the name of your podcast, "Humans of InfoSec," suggests, we have a bigger role to play than simply designing the technology.
Caroline: I find myself feeling really inspired and really fired up by hearing your words. I do think that there's a call to action here for our listeners. And, Spaf, when I observe you interacting with students in the various roles that you have as an educator, I've heard you encourage us... I think that when I think about security, like, sure, there's tech, you know, but so much of it comes down to decisions, and so much of decision-making comes down to power, control, and ethics. And so I do think there's an opportunity for those of us in the field, and those looking at the field, to really recognize the importance of these aspects and to do what we can to influence things in a positive direction.
Eugene: Agreed. You asked me at the beginning of this what continues to inspire me, and it's really around these issues. I believe that being able to make informed choices is important. And if all we do is educate about the technology rather than about the implications, people won't be in a position to be able to make informed choices about their own use and about what they're designing. So this is an area where I think we have some of the greatest potential as people working in the field, is to try to help shape what comes next.
Caroline: Very cool. Gene, in addition to being an author, and an educator, and a technologist, you've also been a trusted advisor to many large-scale corporations and multiple branches of the U.S. government. I wonder, what are your thoughts on some of the threats that we're gonna face in the next couple of years?
Eugene: I think the biggest threat is the use of high-performance computing and algorithms to attempt to manipulate populations, whether it is false stories, altered images, the faked reality of videos and pictures, only some of which we're beginning to see now, the manipulation of information, the deep data mining of trends and psychological information, or the modeling of human consciousness to shape people's perceptions and reactions for politics and commerce. We should be in a position where the technology is serving us rather than us being manipulated by the technology. And I believe, based on what we've seen so far with, for instance, some of the meddling that the Russians have done in our social media and in much of the West, that we are easily victimized by false information. And if we're not careful, that will be incredibly disruptive to our society and our ability to make beneficial progress.
We could potentially end up at a point where we begin to distrust all technology. It's already happening in some respects, where we have people who are distrusting, for instance, vaccines, which are well-tested and generally quite safe, especially compared to the alternatives. If people begin to distrust some of the computing technology, where do we go? What do we do? So I think that is the biggest threat: the technology being turned against us to manipulate us. We don't have the guardrails or the experience in place yet to protect us against that. That's more than any technical threat such as password guessing or malware, or anything like that. I think it's the larger social trends that we're seeing, with the ability to manipulate and present false information as if it's real.
Caroline: Yeah, that's downright terrifying. And certainly, from my perspective over the last several years, we've seen lots of different scenarios, both in the United States and internationally, involving disinformation and questions about facts, truth, and science. I think we can't take for granted that folks have a common and shared perspective on those things. I wonder to myself to what extent we as cybersecurity professionals have an opportunity to partner with colleagues and experts in different fields like political science, maybe even philosophy, maybe even ethics, certainly psychology. This stuff is downright terrifying, and I will be thinking about this for some time.
Eugene: Well, I agree. And as I mentioned, CERIAS, our center here, which turns 25 next year, was founded on this basic principle that cybersecurity is multidisciplinary. It is psychology, it is philosophy, it is political science, it is management, it is communication, all manner of other areas of research, including the technology. The phrase that I've often used, and I've been quoted in many places, is that, "Without computers, we'd have no computer crime, but also, without people, we'd have no computer crime." And so if we only focus on the computers, we're missing a big component of the issues we have to address.
So I would certainly encourage anybody listening to this who is concerned about these issues, who is thinking about these issues, to investigate the potential for collaboration with others, and maybe to get involved in some of the organizations, like ACM, ISSA, and others, that are also multidisciplinary and looking at the broader issues, because each one of us can play a role.
Caroline: Fantastic. Collaboration, multidisciplinary. I certainly will personally be more on the lookout for those types of opportunities. Spaf, you're a security expert, you're a scientist, you're a professor, so many different roles that you play in supporting and contributing to our industry. Can you tell us a little bit more about the kind of work you do with your students? And I'm also curious to learn about their perspectives and what you can share with us about how some of these folks are thinking. Do you ever get surprised? Do they ever do or say anything that is unexpected?
Eugene: Well, those are two interesting questions. Let me answer the second one first. I would say, yes, they do come up with some surprises, or unexpected approaches to problems, and that's a good thing. I don't pretend that I've seen it all and done it all. One of the little secrets is that the more one learns, the more one is aware of the limits of the knowledge we acquire. So that's one of the gratifying experiences of working as an educator: being exposed to people who don't have preconceived notions and are able to think outside those boxes and come up with new ideas.
As far as working with students, most recently, the students I've been working with have been in generally two areas. One area is an extension of work I've done all along in intrusion detection, but now we're looking at intrusion detection and intrusion response in operational technologies, real-time technologies beyond the traditional information technologies that most people use, because those are hard environments to work in, especially the real-time ones.
There isn't a lot of extra processing time available to run algorithms and examine artifacts before making determinations. So there are challenges there about how we are able to detect tampering and misbehavior in real-time systems that affect safety and security. The second area, where I have several students working, is in privacy-supporting kinds of technologies: measuring confidentiality, for instance, having a good metric for confidentiality, which we haven't had before. And we're making some progress in certain limited domains there.
Another student is working on developing a formal classification and evaluation methodology for anti-censorship software and systems that are used by people to get around censorship by oppressive regimes, for instance. The future is sort of uncertain. I give my students a lot of flexibility to identify topics that they find interesting. And I've got three new students who are just beginning to explore some concepts, but they're interested in both real-time and quantum computing, and so we may yet identify some good topics there. But it's basically encouraging them to explore and put forth ideas, and we toss those around and ask, what's a problem that nobody's really talked about before, and how can we approach a solution? That generally leads to interesting results.
Caroline: What fun it must be to be a student of yours.
Eugene: I'm not sure they'd all agree with that, but I try to make it a good experience because this is setting them on a pathway for a career, for life, and I don't want them to look back at it with distaste. I want them to think about it as a really strong first step out into what they're going to do to change the world.
Caroline: That's amazing. I think there is a difference between fun and satisfying. And while I certainly like to have fun, I think that purpose, fulfillment, a sense of meaning, these are all things that can be deeply satisfying for us as humans, especially when it comes to our careers.
Eugene: Agreed.
Caroline: Spaf, I wish that I could keep talking to you for more hours, and hours, and hours about this stuff. For today, I have one final question for you, which is, what's coming up next for you?
Eugene: That's a question I'm always asking. I'm not going to be able to do this indefinitely. One of the things I haven't quite figured out how to stop or reverse is the passage of time, but I'm continuing to have fun at the things I'm doing, and I will continue to try to engage students in the classroom and in the research arena, and to try to engage organizations. I don't seem to be involved as much with companies and government agencies anymore because I'm not talking about the latest exploits, and some of the things that I can distill and talk to them about, the fundamentals, go contrary to the idea of the latest patch and the latest interesting new technology that people wanna put in place. But I am looking at the idea of how to convey some of this.
I've co-authored a new book that will be coming out, hopefully, by the end of the year that actually addresses some of the things I was talking about earlier. The book is entitled "Cybersecurity Myths and Misperceptions," and it is intended to look at some of the human issues, the perceptual issues, and the cognitive issues that cause us to miss the mark with some of our cybersecurity and privacy efforts because we misunderstand things or misrepresent them. Maybe we heard them that way, or maybe they were true at one point, but we need to understand their limitations and whether they really apply. And I'll give you one quick example. We've all heard people say that users are the weakest link, and that's not true. Users can actually be one of our strongest defenses, but we have to empower them, we have to educate them, and we have to give them the right tools and situations to exercise their agency.
And they can be our early warning system. They can be part of our defensive system if we understand that. But simply to accept that statement on its face is to dismiss that possibility. So the book is about those kinds of things: we examine over 175 myths and misperceptions that we've identified in our experience. My co-authors are Leigh Metcalf at CERT and Josiah Dykstra at the NSA, and we've taken our backgrounds and our ideas and distilled them down into this book that we hope will be helpful in changing some of the conversation in the field. So that's another thing that I'm working on. I'm not sure what next year will bring. I'm continuing to seek new opportunities, new horizons. My goal is to not become stale and to constantly find new things where I can make a difference. So we'll see what happens.
Caroline: Phenomenal. I cannot wait to get my hands on that book. Spaf, what a pleasure it has been spending these moments with you today. Thank you so much for joining our podcast and sharing your knowledge and experience with all of our listeners.
Eugene: It's been my pleasure, and I hope maybe it's inspired one or two people to look at things in a different way. And I especially applaud you for the concept of looking at the human aspect of this, and for continuing to do the things that you do, because that's where we make a difference: the people.
Caroline: "Humans of InfoSec" is brought to you by Cobalt, a Pentest as a Service company. You can find us on Twitter @humansofinfosec.