Marnie: I learned so much from the team there as well. I think one of the things about security that really is interesting to me, and I know a lot of my colleagues too, is it's constantly changing, right? And technology constantly changes. And so being a technologist in general can be exciting because you get to learn different things.
Caroline: From Cobalt.io, this is "Humans of InfoSec," a show about real people, their work and its impact on the information security industry. Today, I am so excited to welcome Marnie Wilking. Marnie has more than 15 years of technical and managerial experience in information security and multidisciplinary risk management programs across a variety of industries. Before Wayfair, Marnie was global CISO for Orion Health, Senior Director of Security Compliance for Early Warning, and Information Security Officer for Wells Fargo's mortgage division. Side note: I also had a mortgage at Wells Fargo. Welcome, Marnie. I'm so happy to have you on the show with me today.
Marnie: Thank you, Caroline. I'm so excited to talk with you. Thank you for having me.
Caroline: It is my pleasure. So this podcast is about people. It's about careers. And I would love to learn more about your background and your journey in cybersecurity. Where did you start out? How did you become interested in becoming a CISO?
Marnie: Good question. Growing up, I was very good at math. And when I hit high school, one of my math teachers kept saying, "You should go major in math." And I said, "But I don't know what I want to do." And he said, "It doesn't matter. If you go major in math, you're gonna have lots of opportunities open for you." So I did. And some of the opportunities that opened up, obviously, are, you know, teaching, and research, and actuarial work, and I didn't necessarily wanna do any of those things. As a math major, I also had to take a computer course. And at that point in time, any programming classes that we took all had to be done in the lab.
And I had taken some programming courses in middle school and high school also, but having to go to the lab really turned me off. I mean, having to slog through the snow at like 11:00 at night to get to the VAX machine was not so awesome. So I swore I was never gonna do anything with computers. And then I promptly got recruited by one of the Big Four consulting firms, who sent me to coding camp with everybody else who was a new hire.
So, I did do application development for a number of years. But when I was with the consulting firm, one of the projects that we were on involved credit cards and testing credit card functionality. And at one point, one of the numbers from the test bed of credit cards we were using got used for a real transaction. So it was a fraudulent transaction. And I got pulled into the investigation. And it was really interesting. Fast forward a couple of years, I was looking for a new role and sort of fell into a job at a very newly formed cryptography services division at Wells Fargo. They were looking for somebody who had mainframe and CICS experience and, you know, banking, and that's what I had.
And so, one of my first projects was implementing encryption on the mainframe, which was very fun. It had just the right amount of math and, sort of, paranoia and investigation. That was really interesting to me. So from there, I did a lot of things at Wells. For the most part, I moved around to a lot of different groups within Wells, which is honestly one of the benefits of being in a really large organization, right? The security organization at Wells Fargo is big, and there are a lot of silos, but that actually can be good, particularly when you're starting out in your career, because it really does give you a chance to have a lot of internal mobility and try out a lot of different functions. And so, I had the opportunity to do everything from implementing encryption on the mainframe, driving encryption to laptops and desktops, implementing controls, assessing controls, writing policy, and running security awareness and training. And then for the last several years, I was the security officer at the mortgage division.
I don't know that there was a point in time when I suddenly said, "I'm going to be a CISO." But there was a point in time when I said, "You know what? I have a lot of skills that I've acquired from all of these different roles. And I feel like I have a pretty good view from a higher level of what's happening and how that applies to business." And I had the benefit of shifting to Early Warning, which is actually owned by seven of the biggest banks. So when I left Wells and went to Early Warning, I really didn't leave the family. But they were really looking at new technologies at the time. The banks, at the time, were not really cloud-forward at all, and were really pushing back against moving into the cloud. But Early Warning was really looking at how they could leverage cloud technologies in their own data center. And so I got to learn about a lot of new technologies, which, at that point in time, really sparked my interest: wow, things are really changing. And this cloud thing has really shifted the whole paradigm of how we need to think about security. And these things that we've been doing for the last 10, 15 years, we can't keep doing them the same way anymore. I could be a lifelong college student if I had sufficient funds; I love learning. Learning about all these new things was very exciting for me. And so being able to take all of that knowledge and then apply it at Orion Health was really rewarding. And I learned so much from the team there as well.
I think one of the things about security that really is interesting to me, and I know a lot of my colleagues too, is it's constantly changing, right? And technology constantly changes. And so being a technologist in general can be exciting because you get to learn different things. But as a security practitioner, you're expected to be an expert on security across a lot of different things, right? From iOS, to cloud, to laptops, to databases, you know, all of these things. I think it's really fun to be able to explore all of those different aspects. And I know that there are people who dive in and become those security experts on those different technologies. And I love working with people who have such passion about those things.
Caroline: Amazing. I think there is so much to be said for having come up through the technical ranks, having been a software developer, and then working on different security controls and projects, and having that ability to deep dive into the technical aspects if and when that suits you. But certainly, what I observe about being a CISO is that it is a totally different, higher-level thing. I think, because security is so multi-dimensional, we find ourselves using a lot of analogies in this industry. And I think that a lot of the analogies that folks use are not quite right. I don't think security is a vitamin, or a painkiller, or a Band-Aid. I don't think it's something you can inject. I actually think security has always been a result; it's always been an outcome of decisions and actions made by many different people, having to do with many different infrastructures and technologies.
So, the best analogy that I've actually been able to come up with is security as a dance, which is unusual. Something that I think is really cool about your career is that you have not been a CISO purely in the financial sector, or purely in healthcare, or purely in retail; you've actually worked across these different industries. And I wonder, from your perspective, whether the CISO role is different depending on what sector you're in?
Marnie: It's an interesting question. Yes, I think so. Although I think at its core, our job as CISO is to understand the business risks associated with security, inform the business, and figure out how to reduce friction so that the business can be successful, while also protecting our customers and our company information. But I think how you go about doing that differs depending on the industry. You know, financial services is very risk-based all the time. They speak risk, because they've come up through financial risk, and market risk, and things like that. And so, security is another risk. And generally, because they speak that language of risk, it's a little bit easier to... how do I say this? You can put security in those financial risk terms and they'll get it very quickly. For risk in other areas of the business, like in a technology business or a healthcare business, you have to figure out what their risk language is and use that language.
One of the things I have learned that I like to do, and I think it works well (hopefully it works well), is, instead of coming at the business and saying, "We have this cybersecurity risk and we have to fix it," going to them and saying, "I see this issue and I think it's a risk for these reasons. What do you think?" Right? So say there's a code vulnerability in a supplier-facing portal. I think there's a vulnerability there that has to be fixed, and it's a high priority, because these are the bad things that could happen. But you tell me, partner, "Do you think that's a risk?" And what I've found is that a lot of times there are things that I don't know about, right? I'm making assumptions about what the technology looks like, or what the processes look like, or, you know, other things that are happening. And I learn a lot when I ask the question in that way, instead of assuming that this is absolutely a risk, and it's a horrifying risk that we have to fix right this minute.
If I take a minute to ask the question, say what I'm worried about, and ask them if they are worried about the same thing, I learn a lot from that conversation. And sometimes it's, "No, I'm not worried about that thing, because of all these other reasons." And suddenly, I'm like, "Oh, you're right. That's not our biggest problem." But sometimes they agree and say, "Wow, yeah, I'm worried about that, too." And then we get to have the conversation about "Okay, how would you like to fix this?" Right? "What do you think would work for your organization to fix this?" And then I get to learn about how their organization works as well. And I think having those conversations, keeping that dialogue open, and not presuming that I know more about the risks to their business than they do really helps solidify that relationship and helps me become an actual business leader, as opposed to a CISO and "technology leader."
I think one of the ways that the CISO role has changed over the years... you mentioned the technical background. I think there's still a really strong expectation of a technical background in a CISO and CIO. There's also, though, a really heavy expectation of coming to the table as a business partner. And that's a big switch for a lot of people who have come up through the technical ranks. And so getting that training, understanding how the business works, and asking the questions honestly: tell me how your business works. Tell me how these processes work. Tell me where you're seeing friction that you think I might be able to help with. It helps you learn from a business perspective, helps you learn how to sit at the table with those business leaders, and also gives you a completely different perspective on how the business runs and what those risk tolerance levels are.
So going back to the genesis of the question around what's the difference between being a CISO in different organizations, it's really how they talk about risk and what those risk tolerance levels are. And you don't know that unless you ask the questions.
Caroline: I love it, Marnie. I think that curiosity is perhaps one of the most undervalued traits of the best CISOs. Because I agree with you, I do think that there is an expectation of both a technical background as well as the ability to be a strong business partner. And I think the only way that any of us get there is via curiosity and by humble learning. Marnie, you mentioned you love learning, and that you could be a lifelong college student. I wonder what kind of books have you been reading lately? And what kind of courses have you been taking lately? And in general, how do you learn? How do you like to consume information? How do you like to stay up to date with changing trends?
Marnie: So, I don't read as much as I would really like to anymore. It's just hard to find the time. I take my information in smaller bits, a lot of podcasts. I love your podcast. Ellen Alfred [SP] has a great podcast also. For business podcasts, the "TED Business" podcast is great. "Elevate" and HBR's "IdeaCast" are a couple that I really enjoy listening to. They interview a lot of women leaders, and so there are lots of conversations about different parts of your career, different timeframes in your career, and just a lot of really good career information.
There is a book that I highly recommend for security folks, particularly security folks that are working in tech-forward, cloud-forward, agile-forward organizations: "Accelerate" is the name of the book. It's really about CI/CD development practices. But there are elements that we need to be applying to security, right? So if a web team wants to deploy code 200 times a day, from a security standpoint, I need to be able to support that. And so understanding, you know, what it is that makes a high-performing development team can also help you understand what makes a high-performing security team.
We are in a place where I think we can't keep throwing up gates, which I think is historically how we got the attention we needed, right? When deployment and development were more linear, that made more sense, but it's not linear at all anymore. And so, guardrails as opposed to gates, and making sure people are informed and armed with the information. You used the analogy of a dance earlier. One of the analogies I like to use about security is that 40 years ago or more, there might have been one person in the office who knew how to run "the computer," or even longer ago, there might have been one person who knew how to type really well.
Now, we all carry computers in our pockets, right? Everybody knows how to "run the computer." We need that to happen with security; we need security information to be as ubiquitous as computers in our pockets. And the only way we can do that is to make sure that we are pushing that information out to our developers as much as possible, as opposed to just stopping them and telling them they're doing things wrong. And we can't necessarily always guarantee that we're gonna catch everything from a security standpoint, but that's okay, right? If we give people the expectations, and then we decide what things are the most important to us and to the business, to make sure that we are keeping the risk within our agreed-upon tolerance, then we can put checks in place for just those things. We don't need to have checks in place for absolutely everything.
But I really do highly recommend that book. And there is a really great chapter on transformational leadership, which I highly recommend. I've given a talk around being a leader in security and what you need to do. But it's not just about being a leader in security, although I tie it tightly back to security. It's also about customer service, right? The questions I was talking about earlier: making sure that you're talking with your business partner, asking questions, and providing a service to them. Collaboration, innovation, communication, context: making sure everybody understands the big picture, not just what the security vision is, but what the vision for the business is and how security fits into that, so that everybody understands what that looks like, and they can have those better conversations with their business partners, too.
One of the examples I like to use is that we've come a long way from saying "no" to saying "no, but..." We can't keep saying "no, but," and we can't even say "yes, but." We need to say, "How can I help you do this?" Right? "And how can I help you do this in a secure way?" The example I give, and then I make everybody tell me how they're gonna say "yes," is: your business partner comes to you and says, "We're gonna build a data center on the moon and put all of our databases there." And your answer is... it cannot be "no." Right? Your answer has to be, "That sounds like fun. Let's figure out how to do that."
Caroline: I love it. And I have in my home a hard copy of "Accelerate: The Science of Lean Software and DevOps." But I do think that we, as a security industry, have a lot to learn if we just look over at the adjacent field of software development. One of the things that I've been thinking about lately is how we, as an industry, got a new OWASP Top 10 last year in 2021. I took our industry's latest OWASP Top 10 and I put it side by side with the version that came out in 2003, the very first version. And if you look at those two, they're sort of alarmingly similar. And what that says to me is that as an industry, for about two decades, we have known exactly how to find, and fix, and prevent vulnerabilities, and yet they persist. So what are we doing wrong? Where's our opportunity?
Certainly, in 1998, a group of hackers went and spoke at the U.S. Senate and said, "Look, this is why the internet and software is insecure." Twenty years later, they went back and they said, "Yeah, you know, not much has changed." And yet, when we look at software development, a lot has changed in the last decade. We know that organizations that are good at DevOps have higher performance, you know, similar to the "Accelerate" book, but kind of in a different way. I'm a big fan of "The Phoenix Project" and "The Unicorn Project." One of the things about "The Unicorn Project" is it has these five ideals of DevOps, and I think we have an opportunity to really adopt those in security. So, the first ideal being locality and simplicity. The second being focus, flow, and joy. The third being improvement of daily work. The fourth being psychological safety. And the fifth, which is something you specifically mentioned, Marnie, customer focus. So that's just really cool. That's so cool that you brought that up.
Marnie: I think it's a really good point, too, because I had this sort of enlightening moment several years ago, shortly after I read that book. I happened to be talking with another CISO, Julie Chiquillo, at the time, and we were talking about, you know, how do you actually apply that in security? And she said, "My entire security team is all developers." And I was like, "Oh, you're right. How do you do security as code? You hire developers to do security, right?" It was one of those things that seems so simple now when you think about it, but that's not how we've done security over time. We've had technologists and practitioners, but a lot of security folks don't have a development background. And I think it's going to be difficult for us to move forward using the DevOps model in security and doing security as code if we don't have security people who are developers.
Caroline: Yeah. How can you secure something if you don't understand it? That makes a lot of sense to me. Marnie, switching gears, let's talk about crisis management. This is something that CISOs deal with, and I expect you have dealt with crisis management. Tell me what you think about crisis management, and how organizations should be thinking about it.
Marnie: So, I know you and I have both had plenty of experience with crisis management. I think a lot of people, particularly, again, thinking about it as technologists, tend to think of crisis management as disaster recovery from a tech standpoint and how we do IT disaster recovery. But crisis management is a lot more than that. Also in that vein, a lot of people think of crisis management as how you're dealing with the crisis in the moment, but it starts way before that.
And so, I think one of the important things for companies to think about is, how are we planning for a crisis? And by planning for a crisis I don't mean, you know, trying to plan for everything that could happen. You can't possibly do that. I once talked with a company that had their crisis management playbook. And it was enormous. And I said, "Gosh, why is this so big?" Literally, the back of it had different scenarios and how to deal with them, down to the squirrel-falls-into-a-transformer scenario, right? If that happens that often, sure, maybe you do need a playbook for that. But typically, that's not the level of detail you wanna get to. But you do want to know that you're gonna be able to have the right people in the room to make the right decisions.
Planning doesn't have to be at a really detailed level, but planning should be, you know, "Do I have the right processes and plans in place? Do they tie together appropriately? Do I know that my information security incident response plan ties into the global corporate incident response plan? And are those tied to our public relations team's crisis management communication plan, and our investor relations team's communication plan?" How do all those things tie together? And a good way to do that is to have a tabletop. And, you know, again, I don't mean a tabletop as in "Let's talk about this database going down or being ransomed, and how do we technically deal with that and recover from it?" I mean, what are the business implications of that ransomware? What tools are suddenly not available? Which businesses can't run without that, and who gets to make the decisions on what actually happens at that point?
And so running those tabletop exercises with the senior leaders in the room is super helpful, because they start to understand, "Oh, these are the things I'm gonna be asked, or these are the things I might have to think about, or these are the people I'm gonna have to involve in these discussions," right? You don't have to fix all the problems, but just going through the process and asking the questions and getting them to think about it really helps them exercise that resilience muscle. So that when the time comes, everybody knows where those plans are, everybody knows how to follow the plans. All the plans tie together and you can have a cohesive communication and decision-making process in the moment and then the follow-on communications after that.
I do not envy the PR team's role during crisis management, but it is so important for us as security leaders, and the technology leaders, to all be plugged into what PR's role is, right? What is investor relations' role? What is legal's role? And make sure that they all know what those roles are. You can't have organizational resilience if you haven't at least planned a little bit on how you want to be resilient.
And then I think the last part of crisis management, you know, you've got the planning and you've got the "in the moment." The last part is what are you gonna learn from that? Not just how are you gonna recover and go back to business as usual. You need to go back to "business as better" based on what you've learned during that crisis. And that's really what makes organizations good at resilience and can make the difference between an organization that just sort of creeps along after a crisis and one that really comes back strong and demonstrates that they're able to adapt and change.
Caroline: Marnie, I agree. There's a difference between having a list where you enumerate every possible thing that could go wrong, and having a state where, if something does go wrong, people actually know what to do, who to call, and how to operate. Marnie, I wish that we could do this all day. And we know, as podcast listeners, that 20 or 30 minutes is about the right amount of time for a podcast. Marnie, thank you again so much for being with me today. I've enjoyed this so much.
Marnie: Thank you so much. I always enjoy talking with you, Caroline. Thank you.
Caroline: "Humans of InfoSec" is brought to you by Cobalt, a pentest as a service company. You can find us on Twitter @humansofinfosec.