Kelly Shortridge: Welcome to another edition of Between Two Kernels. Today, my guest is Joel Fulton, who is the author of The Battle of Frogs, which is a book about marine warfare, and sadly not about frog warfare, it’s human warfare. Did I get that right?
Joel Fulton: You did. You did. I learned it through my experiences defending a small Amazonian village from a horde of frogs using only a hoe and a glass of dirty water.
Kelly Shortridge: I see. Okay.
Joel Fulton: It’s life changing.
Kelly Shortridge: Yeah, sounds like it. So, it sounds like that really prepared you for the notion of data lakes at Splunk.
Joel Fulton: It did. That, and my experience doing full contact origami. So together that became what was necessary to do data lakes.
Kelly Shortridge: When you have the reports from Splunk, you put it into origami, and present it to the board?
Joel Fulton: Safely, right? Without any edges exposed, but yeah, I think you got it in a nutshell.
Kelly Shortridge: Great.
Joel Fulton: What is a data lake?
Kelly Shortridge: That’s a good question. I tend to think of it more as a data swamp for the most part. I think the idea is that you want a data lake-
Joel Fulton: Fetid, festering, stagnant.
Kelly Shortridge: Yes, exactly. It’s just full of stuff that you don’t want in there, and you’re trying to sift through to find something drinkable. So I think the data lake is kind of the shining ideal, but we don’t have it very often. What do you think?
Joel Fulton: I think we have mutually exclusive, hotly pursued goals. There’s huge value in all the data possible in my hands now. And that value, very specifically, is I want to ask questions of the data and don’t know the question to ask. So statistically, things like exploratory factor analysis. Why just look at p-values for certain relationships? To try to understand what questions aren’t I asking of the data? Maybe call that R&D. Exploratory, low risk, high yield, but infrequent high yield.
But production, defending the environment, I need to know exactly what data I have, why I have it, groom it, keep it current and protect it, reduce my legal liability for the exposure, for the retention of these data, all of that. And those are usually exclusive.
I think that James Bond doesn’t use every weapon that Q develops.
Kelly Shortridge: That’s true.
Joel Fulton: But Q has got to develop a lot of crazy weapons. And I think that we cause problems for ourselves by conflating the two. Q doesn’t fight, he’s the experimental side, so keep him locked away, so that when experiments go wrong and blow up, you limit the detonation. Maybe that’s how we should look at data lakes.
Kelly Shortridge: I don’t disagree with that. It also ties into a conspiracy theory I’ve heard: that you left Splunk because your own Splunk bills were too high.
Joel Fulton: That’s neither conspiracy nor theory. Ironically, oddly, I didn’t have a license problem at Splunk. But, there’s a Stephen King story, and it’s macabre, and I’m not sorry for bringing it up, of a surgeon who was shipwrecked, and he’s on a deserted island. And along with him, washes up a crate of medical supplies. He’s got no food, but he’s got morphine and surgical supplies. And so, as the story progresses, he begins amputating parts of his body and consuming them to keep himself alive. And as the story concludes, everything extraneous is gone, lips, ears, the whole bit. And he’s down to the index finger and thumb, because he can still hold a scalpel, and he’s trying to figure out what to do next. That’s why I came to Splunk.
Kelly Shortridge: I’m not sure if I totally follow. It does sound an awful lot like the infosec industry cannibalizing itself.
Joel Fulton: It does.
Kelly Shortridge: It does.
Joel Fulton: If you boiled it all down, if you take away everything, the one thing I want is my central nervous system, it’s my intelligence. If you take away … And protection. I could survive without it. It’s kind of like that shipwrecked marooned surgeon. I can survive without a lot. I don’t want to just survive, I want to-
Kelly Shortridge: Right.
Joel Fulton: But if you have to peel it all away, I want intelligence. And so that, to me, was why I went to Splunk.
Kelly Shortridge: Interesting. I’m not sure if that’s the best pitch I’ve heard, I’m not going to lie.
Joel Fulton: I told you it was macabre.
Kelly Shortridge: Yeah. That’s interesting. Yeah. Maybe you have a future career in marketing.
Joel Fulton: Probably not.
Kelly Shortridge: Probably not, yeah.
Joel Fulton: I don’t know. What do you think? No, she’s laughing. No, I don’t. Her job is safe.
Kelly Shortridge: You’ve spoken about documenting what you’re divesting in your security strategy. So my question is, how can we divest the RSA conference from our industry?
Joel Fulton: Why do you go to the RSA conference?
Kelly Shortridge: Mostly to meet fine people like yourself and connect with them. And it’s the most convenient meeting spot.
Joel Fulton: It’s been disappointing, hasn’t it?
Kelly Shortridge: It depends. Some of the people are okay. If I die because of coronavirus, then it definitely wouldn’t have been worth it.
Joel Fulton: I don’t go to the floor. And when I meet other folks and talk to them, I ask them, why are you here? I’m an introvert naturally, and so I don’t like the crush of crowds.
Kelly Shortridge: Same.
Joel Fulton: It’s probably not unusual, right?
Kelly Shortridge: Right.
Joel Fulton: That skill or that predilection tends to draw people like us to these industries. I think so, totally. RSA isn’t for me. Perhaps RSA isn’t for you. And I think there’s a mismatch there. I’m just going to say dishonesty, by which I mean we have a lot of these conversations about products changing the world.
Kelly Shortridge: Revolutionizing.
Joel Fulton: Right? Now, I’m going to be all self-aware and open to criticism, like you need that permission: I’m starting a products company. I firmly believe products wound employees. In World War One, they learned that if you shot to wound, you would take three soldiers out, because it took two to carry the wounded. If you shot to kill, you would only take one out.
Joel Fulton: Products wound people. And so instead of being a risk manager, or instead of being somebody who’s superb at understanding the memory process of an operating system, now you’re an expert at CrowdStrike, or Carbon Black. That reduces your functionality, it reduces your intelligence, it drops your morale, and then, you know, we’re going to subsist in three years.
Joel Fulton: And so we take people who are excellent at surviving off the land, like Tarzan, and we put them in a suit and we make them sit behind a desk, and they used to swing with the monkeys and it was awesome.
Joel Fulton: I think that a lot of people look at it this way. We have a lot of conversations about how do we pick a better tool, or how do we rationalize tools, which ones do we want to get rid of? Or how do we do a better job of not letting tools drive the process?
Joel Fulton: I think the mistake we make is failing to acknowledge people matter more than technology.
Kelly Shortridge: Mm-hmm (affirmative), definitely. Processes too, I would say. A lot of people ignore them. You can have the best technology in the world, but if your process is garbage, guess what? You’re going to have garbage as an output.
Joel Fulton: Or a wonderful process that repeatedly drives nonsense.
Kelly Shortridge: Yes, that’s true.
Joel Fulton: If you automate stupidity, it proliferates.
Kelly Shortridge: So basically like every AI tool.
Joel Fulton: Is there an AI tool?
Kelly Shortridge: I mean, they certainly claim it. Actually, on another Between Two Kernels episode, we talked about AI pretty extensively. And the conclusion was it’s basically like a toddler that has a knife. It’s just not great.
Joel Fulton: That’s interesting. Wandering in the garage with your new BMW.
Kelly Shortridge: Yeah. Something like that. So, you’re kind of presenting this landscape, it sounds to me, like a survival horror RPG. Which actually goes perfectly with the fact that you talked about open unencumbered APIs, which certainly makes it sound like the security industry has some sort of inventory weight limit. So are we in the survival role playing game?
Joel Fulton: Cool. People talk about the short tenure CISOs have. And they do it with a hint of shame, or as if there’s a problem: how can I succeed? Why is the average tenure only 18 months? Like it’s concerning.
Joel Fulton: What are the hallmarks of a great CISO? How do you win as a CISO? I know how to win as a CFO. I know how to win as a CIO. I know how to win as a head of sales. How do you win as a CISO?
Joel Fulton: It’s the reciprocal of all of them. What does a great CISO look like? Well, there hasn’t been an incident. Well, does the CISO control that? Well, no. So what does a great CISO look like?
Joel Fulton: Yeah, I think many CISO roles are very much like a survival zombie RPG, where you’re standing alone on a hill, trying to shoot all those zombies. But, it turns out you’re not alone. And it turns out if you took the satellite view, there’s somebody really close to you on the hill who’s fighting zombies. And one of the beautiful things about being a CISO is I don’t compete with you. So you can be the CISO for my competitor, UPS or FedEx, but together as CISOs we can trade threat intelligence and hiring practices. And we can, with two of us on the hill, get each other’s back. When you get eight of us on the hill, suddenly it feels like a miniseries, and not as desperate.
Kelly Shortridge: Yeah. It sounds an awful lot like Allen Alford when he was on Between Two Kernels talking about the distributed tier network, where it’s basically think about blockchain, but it’s CISOs, and they’re all kind of trying to strive together. And ideally the strength of the network helps everyone else.
Joel Fulton: It’s amazing. When people matter more than tech … I just left the thing with a bunch of CISOs. Why are you here? Because this is the only time I know all the people that do my job are here, and it turns out I’m not crazy. It turns out I’m not alone.
Kelly Shortridge: But what if it’s all a shared delusion? What if all CISOs are crazy, and they’re all just kind of missing the point? The reason why I ask is I often think that there’s a lot of overthinking that happens in security, where ultimately, yes, in theory you don’t want some sort of incident. But to me, looking, again, more to the upside, it’s how do you recover most quickly? And also, again, how do you enable the business? I think a lot of times we err too much on the side of security as this kind of very definitive, concrete, this is exactly-
Joel Fulton: Do you?
Kelly Shortridge: I do. I’ve seen a lot of CISOs who think that way, where it’s like, no, my job is to just say no as much as possible. And I think that’s delusional.
Joel Fulton: Interesting. I agree. I don’t think that’s appropriate in every circumstance. And I think those are the folks that burn out the fastest.
Kelly Shortridge: That makes sense.
Joel Fulton: Because there’s no such thing as achieving security. Somebody else, and I think it was at the event that you and I … commented that security is like health, or fitness perhaps. You never achieve it. You’re always working towards it, and that’s one of your pursuits. When are you done? Well, you’re going to reach an age where now you’re staving off loss. Your improvements are done at this point. That kind of feels like security.
Joel Fulton: At Splunk, since we started talking about Splunk, I was hired by the CFO there, and that’s a little different. That’s a different org structure, and so I had some trepidation. I met him, liked him, but then let’s see how the org works out.
Joel Fulton: So we sat down for an orienting conversation, and he starts telling me this story about golf. He’s a golfer, and I am obviously not a golfer, and so … I’m avidly not a golfer. So he’s telling me that he’s at whatever the hole is, and he putted, and the line was different. The ball went a different way than he was used to, because he knew it so well he expected it, and didn’t get what he expected.
Joel Fulton: And so, the groundskeeper was there, and he says to him, “Hey, this thing isn’t doing what it did. What changed?” And so this is a long story, but I’m wondering where this is going, just like you are right now. So you feel like you’re in the moment.
Kelly Shortridge: Yeah. Honestly, golf is just not my thing. The war and zombie metaphors were grabbing me a lot more.
Joel Fulton: So you feel me, right?
Kelly Shortridge: Yeah.
Joel Fulton: I was like, where is this going? And he says, “The groundskeeper says to me, the golf course is a living thing. It changes all the time.” I thought, okay, all right, kind of feels like a Mr. Miyagi thing, but okay. And then he stops. He’s done with the story. He turns to me and goes, “That’s what security is. I know you’re never going to be done.” But I thought, Holy smokes, he gets it. He gets it. It is the zombie apocalypse. You’re never done with zombies. They’re always coming. So knowing that, not deluding yourself, now what do you do about it?
Kelly Shortridge: The thing is, how is that different from other functions? Because I can tell you, not having come from a security background myself, that’s pretty true across most functions. You’re never actually done.
Joel Fulton: Yes. But, there is a standard to which you can be held, and you can measure yourself and say to a third party, “See, I’m doing a good job.” You’ve got to be held to the same categories as your CFO.
Kelly Shortridge: The thing is, I think, like in a survival game, I think we just haven’t found the right metrics. I think, again, we’re trying to reach for those impossible metrics, like perfect prevention.
Joel Fulton: Good. We can’t have metrics.
Kelly Shortridge: Why?
Joel Fulton: Because to get the metrics, you need to have external validity. So, I am not going to tell you my measurements from Splunk when you’re not at Splunk, it’s inappropriate. Because of that lack of the ability to share that inside information, we can’t build an actual attainment. We don’t know what controls are effective regardless of the environment in which they’re employed. If we could, then we could realize things like a-
Kelly Shortridge: We shouldn’t have RSA as a conference.
Joel Fulton: And a lot of the people driving RSA, and the value there, might be completely irrelevant.
Kelly Shortridge: Yes. That’s honestly, my conspiracy theory here is that a lot of vendors specifically kind of drag people towards metrics. For example, types of malware found per month. Is that actually helpful? I don’t think so, but it’s certainly convenient for renewals, right?
Joel Fulton: Yeah.
Kelly Shortridge: Is it actually helping? I don’t know.
Joel Fulton: Yeah.
Kelly Shortridge: And as a final question, why is your startup in stealth? What makes you scared? I have a good idea, given the kind of flow of the conversation, with the war metaphors and zombies and so forth.
Joel Fulton: So you’ve begged the question that I’m scared. You said, why are you scared?
Kelly Shortridge: People are generally in stealth because they’re worried that not being in stealth, revealing something, would hurt the business. So I guess on the flip side, why do you think stealth benefits you?
Joel Fulton: Yeah. It’s intentional. I’ve had the humbling experience of having people decline a job. That didn’t happen at Splunk. And it didn’t happen so much it was easy for me to think it was me. Right? It had nothing to do with me. So having people say no, now it’s like, wait, maybe you didn’t understand.
Joel Fulton: Why in stealth? Because there is momentum that is possible, and it’s possible to waste it or use it appropriately. So, our intent and plan is that we have our debutante in Paris when we’ve got a couple of customers that have great things to say about us, we are ready on the product, and then we can announce and get the ball rolling. The ball rolling to increase traction with hires, and the subsequent customers.
Kelly Shortridge: Basically the idea is that the hype train is going up the hill right now, and you’re going to start blowing the horn once it’s fast tracked to funding town or something like that?
Joel Fulton: That’s pretty good. Sort of. I’m not a Facebook guy. You can’t go on Facebook and find out a lot about me. And that’s because, one, I’m not that interesting. And two, I’m a private person. Probably like most people. So when I want to share something, I wait for people to ask, and make sure that I know them, that I trust them. And that’s just a personal thing. I’m not better than others. It’s just my style and approach.
Joel Fulton: So as we’re doing this, it’s easy to talk a lot about we’re doing this, we’re amazing at that. What we’re building … And so here’s the phrase that’s trite when spoken, but it’s meaningful: by CISOs, for CISOs. So we’ve got 16 CISOs that are helping us build this product. And our goal is to be transparent, like your Android app. You know what it costs you. You know what permissions it needs. You know how to turn it off. You know how it delivers value. And to get it that simple and that right, we’ve got to wait, because a lot of talking about it early … Why do I want to be at RSA every day? If I have to do it, I want to do it once.
Kelly Shortridge: That’s fair. I still like the hype train idea. Or, for the people out there who wear makeup, like Glossier being transparent and natural.
Kelly Shortridge: Yeah, perfect. Thank you.