Safety Radar on Oil and Gas Startups

0:00 All right, what's going on, Oil and Gas Startups? We've got Safety Radar. We got Garrison and Matt on the show today. Garrison, we were just talking about how we got introduced through Chrissy,

0:11 and you're telling me, and then I interrupted you. I was like, no, no, no, no. We got a, this is good podcast content. We'll record it. So how do you, how did you know Chrissy? Yeah. So

0:20 Chrissy, for everyone: she was at Capital Factory formerly. She just left recently, and then she started her own thing, huh? Yeah, she's working, I think she's working for a startup

0:31 out of California now. But I've got a bunch of connections to Chrissy and her husband, a guy named Will Edwards, who founded an aerospace company. But I think the most direct connection is through

0:44 Capital Factory. I'm an Army Reserve innovation officer. So one of the things that I do, you know, one weekend a month, two weeks a year, is go out and scout new technologies that may have some

0:55 applicability in the defense sector. And Chrissy used to run their federal program. So she'd go out, look for new tech companies that they could bring into Capital Factory. And she was showing me

1:07 some of them. And then she was like, hey, I talked with Colin from Digital Wildcatters. Don't know if you heard of it. I'm like, yeah. That's awesome. So yeah, that's how we got connected.

1:17 I've known Will for, I don't know, probably a little over a year now. And I got introduced to Will through one of his VCs, and Will and I hit it off

1:27 and he knew about DW as well when I met him. And anyways, I found out that they're doing a lot of their rocket testing for Firehawk out in Midland, and Midland's my hometown. And so, you know, I

1:35 just thought that was cool. And so Will and I became friends. And then, you know, he's getting some pushback from some local politicians in West Texas. And, you know, I was dunking on

1:47 them on LinkedIn, I've got Firehawk's back. And anyways, Chrissy, I can't remember what it was specifically, but Chrissy had posted something or commented on one of Firehawk's posts.

2:00 And I clicked on her profile, saw her background. And I was like, hey, Chrissy, looks like we have some mutual friends. And I saw, like, Will, and I didn't put two and two together that they had

2:07 the same last name. She's like, yeah, Will's my husband. And I just thought that was funny. I was like, oh yeah, if I had taken two seconds, I probably could have

2:15 mapped that out. But yeah, they're great. I love them. I have one more cool tie there. And this is maybe my favorite one. I got to officiate their wedding. Oh, really? Yeah. Yeah. I will

2:26 just caveat this with I have zero credentials to do something like that. But they're like, no, it's a friends and family thing. I'm like, well, OK, I don't think I can pronounce you man and

2:35 wife under any legal circumstances. But I am happy to do this. And they're like, yeah, we already got the paper. I hate the credential system. And I think if anything, you can officiate a

2:44 wedding. Exactly. I can do that for sure. I talk a lot and host things, I could officiate a wedding. Great side gig. So let's talk about y'all's company real quick. First, Garrison,

2:58 you live in Tulsa, already told us a little bit about your military experience. And then before we got on the podcast, homie spent time in Iraq. And

3:09 so that's great. Matt, you're in Denver? Yeah, that's correct. Okay, cool. You're from Denver originally? No, actually from Oklahoma, so I worked in oil and gas up there. Went to

3:19 school undergrad in Norman. Okay. Moved up to Tulsa to work for Apache. Okay. And then worked for a couple E&Ps up there. Garrison and I worked together in Tulsa, and then work took me to

3:28 Denver and - Okay, cool. I kinda landed there that way. Did you go to school for engineering at OU or Environmental Science? Environmental Science, okay, cool. And then did work for Apache, so

3:39 oil and gas guy. Oklahoma's cooler than Denver, so we've got to dock some points for living in Denver, but it's a hot

3:48 time. It's got its pros and cons to both. Our CFO just moved from Houston to Denver, and he's got two young daughters. He's like, Yeah, we walked to the 4th of July parade, and we didn't sweat.

3:59 And he's like, it was so nice. And so Denver's got good things, but I got to bash. I got to bash you. Just because. Got to do it. Fair enough. Garrison gives me

4:13 a hard time about it all the time. So give me the 40,000-foot view on what the company is and what you guys do. Yeah. Well, so maybe a good place to start is with that military background. So

4:23 as you know, I'm an army guy, went to West Point for undergrad, and then deployed with the first cavalry division. I'm an artillery officer. The home of the artillery is Fort Sill in Oklahoma.

4:36 And when we were getting ready for deployment, this was like the 2011 timeframe, in the weeks leading up to that deployment, like literally like six weeks,

4:48 we lost multiple people to training accidents. And so when you actually went back and look, I mean, these are people who are getting killed, like we're going out for training, we're prepping

4:59 equipment for deployment, people are getting killed on the job. And you just, when you go to a West Point and you're thinking like, okay, I'm gonna go to combat, you expect that you're gonna

5:10 have casualties in a war zone. You don't expect to have casualties from training accidents and somebody loading equipment that they weren't trained to load and then they fall off and they pass away.

5:23 I mean, this was just stuff that when you look at it after the fact, you're like, there's gotta be a better way to manage these risks. So that started the gears turning. Then you actually go

5:33 overseas and you see, okay, people are getting injured, killed in combat, but they're also getting injured and killed by accidents. This is like the ever-present enemy. The way that we manage

5:45 those risks in the army is primarily paper. It's these packets, these risk management worksheets. You can do a lot with that from a training perspective, from a processing perspective. It's just

5:57 like, you know, the stack of paper grows. You tuck it away. You don't go back and check it until something bad has happened. When I moved into the energy industry, Matt and I were working

6:07 together for an E&P at the time, we saw the same thing. So, you know, maybe it's fillable PDFs, maybe there are some hazard collection forms, but there's really no way to get that

6:19 information through the bottleneck of the human, determine signal from noise and figure out what are the causal factors? What are the things that we need to train people on? So that's where Matt

6:30 and I started talking about this initially years ago, where like, there's got to be a better way for us to do this. I ended up moving into the pipeline industry and we saw the same thing. And at

6:40 that point, I was in grad school at the time, I was going to University of Pennsylvania and I called up Matt, I was like, Matt, we have got to spend some time figuring out how to do this better.

6:51 And so we started designing a system. At a really high level, we say we've got three things that we do. The first is we collect information in a way AI can use it. So whether that's,

7:03 we have an app, so everybody in the workforce can have the reporting tools in their pocket, the training tools in their pocket go straight to the cell phone. That's the first thing. But if they

7:14 already have a reporting system, we can just tap into the information that they're already collecting. So if your people are already trained, we'll just connect into that via API, your data lake,

7:25 different things like that. First thing that we do is collect. Second thing we do is analyze. And with the analysis, we've trained,

7:34 different AI models and we put them together in a proprietary platform that we built, where the AI is trained off of the company's safety perspective. So it thinks like a safety person

7:45 does: assessment for actual severity, potential severity, causal factors, potential serious injury and fatality exposure. It assesses that, pulls out all of those risk signatures. And you can

7:57 see that in real time because it's able to process these reports in seconds where it would take a person minutes to hours to do some of this processing. So those are the first two things. The last

8:08 thing is just the communication piece: a live risk dashboard, alerting based on different thresholds and causal factors. Some of these factors will co-vary together. So one factor, servicing of

8:21 energized equipment, correlates in the historical data with a higher exposure to potential serious injury and fatality. If that moves with some other factors, your exposure risk goes up

8:33 really fast. So the AI is looking for those, extracting those, triggering alerts to the operations team and the safety team. So that's really what we do at a high level.
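To make that co-variance idea concrete, here is a minimal sketch of what threshold-based alerting on co-occurring causal factors could look like. Everything in it is illustrative: the factor names, the report schema, the window size, and the alert threshold are hypothetical, not Safety Radar's actual models or data.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class AnalyzedReport:
    """One field report after the AI analysis step (hypothetical schema)."""
    causal_factors: set       # e.g. {"servicing_energized_equipment", "lockout_not_applied"}
    psif_exposure: bool       # flagged as potential serious injury or fatality exposure

class CoVarianceAlerter:
    """Raise a flag when two causal factors show up together, with PSIF exposure,
    in more than `alert_rate` of the most recent `window` reports."""

    def __init__(self, factor_a: str, factor_b: str, window: int = 50, alert_rate: float = 0.15):
        self.factor_a = factor_a
        self.factor_b = factor_b
        self.recent = deque(maxlen=window)   # rolling window of True/False co-occurrence flags
        self.alert_rate = alert_rate

    def ingest(self, report: AnalyzedReport) -> bool:
        co_occurs = (
            self.factor_a in report.causal_factors
            and self.factor_b in report.causal_factors
            and report.psif_exposure
        )
        self.recent.append(co_occurs)
        rate = sum(self.recent) / len(self.recent)
        return rate >= self.alert_rate       # True means: notify the ops and safety teams

# Example: feed a stream of analyzed reports and watch for the alert condition.
alerter = CoVarianceAlerter("servicing_energized_equipment", "lockout_not_applied")
reports = [
    AnalyzedReport({"housekeeping"}, psif_exposure=False),
    AnalyzedReport({"servicing_energized_equipment", "lockout_not_applied"}, psif_exposure=True),
]
for r in reports:
    if alerter.ingest(r):
        print("Exposure rising: these factors are co-occurring in recent reports.")
```

A real system would track many factor pairs and tune the window and threshold per customer; the point here is only the shape of the check that sits behind the dashboard alerts.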

8:42 I love it. One thing that kind of stood out to me is you're talking about past experiences with deaths in the military.

8:50 One, overseas in combat areas, like that wasn't even a thought to me that hey, there's deaths that aren't combat related, but I mean, it makes perfect sense I mean, you're in a dangerous

8:59 industrial-type environment with heavy equipment and things of this nature. And so I don't know why I've never thought of that, but yeah, it's like I could see a lot of accidents happening. And

9:10 then yeah, anytime you hear, you see a headline of someone dying on base or on a training ground in the United States, just like how does that stuff happen? And I mean, look, they're accidents

9:26 for a reason, right? when I look at like my experience in the oil and gas industry,

9:34 you go out there, you pencil whip a JSA, and I was in the oil field before stop cards existed, and then they introduced them, I'm like, guys, this is like the most virtue signaling bullshit.

9:46 Like, this is not actually, like, I'd sit there, like, they'd be like, you have to turn in, you know, two stop cards a day, and like, you sit there, and like, you make up shit. Yeah.

9:55 Yeah. To turn something in, right? And so anyways, there's a lot of, there's actually a lot, and I've been wanting to go on this rant for a long time, but there's a lot in safety culture that's

10:06 just theater. Yeah, for sure. And it doesn't actually make a meaningful impact, you know. My second month on a rig, my motor man got sucked into the drawworks on our drilling rig. Oh, cool.

10:19 He's very fortunate, it just ripped off his fingers, didn't sever his head. And, you know, I remember like impact gloves had just come out right after that. Like impact gloves aren't going

10:31 to keep you from getting your finger ripped off, but you know what would have stopped that from happening? Lockout/tagout. And we didn't lockout/tagout. I didn't even know what lockout/tagout was.

10:39 And it's like if you focus on like the real inputs that actually matter. And so that you're not in that position in the first place. But love that we get to talk about AI. You know, we, I'm an AI

11:08 nerd and we've been building some pretty intricate retrieval augmented generation over here at DW as well. And like I think about HSE a lot and, um, like what you guys are building, because if

11:08 you look at it, you know, just say that you're an oil and gas company, big oil and gas company out in the Gulf of Mexico. I mean, you've got decades worth of historical information, right? Yep

11:20 That's probably not really useful to you in the state that it is. It takes hours, days, months to, one, go find it, and to make any kind of correlation out of any of the information. But to your

11:35 point with AI, it's like, you do that really fucking quick. Absolutely. Yeah. I mean, going back over looking at causes, taking those causes and applying the potential severity of the events to

11:47 them. Now you're able to look and say, okay, what are our causes that are going to have the highest severity, the highest PSIF exposure? And I mean, most of these companies, to your point, what

11:57 we find with our customers is they've got decades worth of data. And they've been collecting it. And they're like, look, we've been collecting it. It's for compliance. But really, there

12:06 was an energy company that said, we feel like we are the Google of information in the energy industry. We have all of this stuff, but we've been waiting for this golden age where we can use that

12:18 and extract all the trends. We can go back and process hundreds of thousands and have processed hundreds of thousands of these reports and you can see for that company, we have a data scientist on

12:29 the team who's very good at going through, running the statistical correlations and saying, these are the things for this timeframe. We're starting to see potential severity ramp up. That means

12:39 your actual severity is gonna ramp up. We go back and track that over a year, and we can actually see the potential severity start to rise, and then the actual severity of events starts to rise.
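That leading-indicator idea, potential severity trending up ahead of actual severity, can be checked with something as simple as a lagged correlation over monthly averages of the AI-scored reports. A minimal sketch, with a made-up severity scale and made-up numbers, just to show the shape of the analysis:

```python
from statistics import mean, pstdev

def lagged_correlation(potential, actual, lag_months=3):
    """Pearson correlation between the potential-severity series now and the
    actual-severity series `lag_months` later. A simple check of the
    'potential severity is a leading indicator' idea; the real analysis is richer."""
    x = potential[:-lag_months] if lag_months else potential
    y = actual[lag_months:]
    mx, my = mean(x), mean(y)
    cov = mean((a - mx) * (b - my) for a, b in zip(x, y))
    sx, sy = pstdev(x), pstdev(y)
    return cov / (sx * sy) if sx and sy else 0.0

# Hypothetical monthly averages of AI-scored reports (1 = minor ... 5 = fatality potential).
potential = [1.8, 1.9, 2.1, 2.4, 2.6, 2.9, 3.1, 3.0]
actual    = [1.2, 1.3, 1.3, 1.5, 1.8, 2.0, 2.3, 2.5]
print(round(lagged_correlation(potential, actual, lag_months=2), 2))
```

A strong correlation at a positive lag is exactly the pattern Garrison describes: the potential-severity trend moves before the actual-severity trend does.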

12:50 So it's like - Interesting. A lot of capability that people don't realize and it's actually, it's really easy to implement. Companies have way more data than they think they do. They just don't

13:00 realize. No, yeah. I mean,

13:04 like I was talking to a company a couple days ago and I'm like, well, we have all these documents going back 30 years. Like, you know, it has a short shelf life because it's not valuable

13:14 to us a few years later, but like what you're telling us is that it can be valuable 100 years later. Yeah. And so, you know, when you look at it, you know, running all this regression

13:23 analysis and, um, you know, looking at, hey, we had these five safety incidents happen. What was the common ground between those? So doing that, are you guys doing any, like what you were

13:41 just describing there of talking about the risk increasing? It sounds like you guys are doing some prediction modeling in that as well. And we kind of stay away from predictive a little bit, but

13:54 proactive. You know, we can look at that and say what risks are rising and, you know, how much more likely are you to have these types of events, for lack of a better word. You can't predict an incident happening, right?

14:04 But being proactive about, hey, look, the risk exposure seems to be trending. Yeah, and a great example of how we can use that data, you know, the regression data, in a proactive

14:14 manner is the JSA, right? If you submit a digital JSA and we see, here's your work activities for the day, here's the work plan, we look back across the data set and say, here's all the accidents that

14:23 have happened when you're doing this type of work, you're this many times more likely to have these types of exposures today on location. And if you're the frontline person, you're getting that in real time. It's

14:32 like, hey, you know, the smashed fingers happen when you're nippling up the BOP, and the last five incidents that you've had have happened during that activity.
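As a rough illustration of that JSA lookup, the sketch below matches the activities on a submitted JSA against previously analyzed incidents and builds the kind of heads-up message described here. The record format, activity tags, and history are all hypothetical.

```python
from collections import Counter

# Hypothetical historical incident records, already tagged by the analysis step.
HISTORY = [
    {"activity": "nipple_up_bop", "injury": "smashed fingers"},
    {"activity": "nipple_up_bop", "injury": "smashed fingers"},
    {"activity": "rig_up", "injury": "strain"},
    {"activity": "nipple_up_bop", "injury": "laceration"},
]

def jsa_heads_up(planned_activities, history=HISTORY):
    """For each activity on today's JSA, return a short heads-up built from past
    incidents tagged with the same activity. A toy version of the real-time
    feedback described above; the field names are made up."""
    notes = []
    for activity in planned_activities:
        matches = [h for h in history if h["activity"] == activity]
        if not matches:
            continue
        top_injury, count = Counter(h["injury"] for h in matches).most_common(1)[0]
        notes.append(f"{activity}: {len(matches)} past incidents on this activity; "
                     f"most common injury: {top_injury} ({count}).")
    return notes

for note in jsa_heads_up(["nipple_up_bop", "housekeeping"]):
    print(note)
```

In a real deployment the history would come out of the analysis step rather than a hard-coded list, and the message would be pushed to the phone that submitted the JSA.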

14:43 So yeah, that's interesting. Just to that point, Colin, one of the other things, you know, you just mentioned smashed fingers, BOP, nippling up, I mean, those are some jargon terms. Our

14:53 industry is jargon heavy. I mean, you know, you're on a rig. Everything's got a name that is not in a textbook. Yeah. And some of them you can't even say on the show, so yeah.

15:06 Are you sure? We can't.

15:09 We saw it on the Hollywood one. There's some risk of getting canceled over on the street. Yeah, quick way to turn off customers Okay, all right. We won't say those. Yeah, you, you know,

15:19 there's, I think there's a benefit to AI being built by energy people in the energy industry. When you think about where a lot of the stuff is coming from and even looking at the traditional

15:31 safety systems out there, the form collection systems and things like that, they're still coming from the coast, which is great. But if this AI that's supposed to analyze risk in the energy

15:42 industry is being built in San Francisco or Palo Alto or Washington State, there's a disconnect. And that's one of the reasons: we felt like, well, we looked at potentially headquartering on the coast, but

15:52 I've lived in the mid-con for over a decade. Matt was born and raised there. We've worked in energy for over a decade, I mean, gosh, well over a decade for Matt. But

16:05 when you look at like what our customers want and the jargon and the understanding that has to go along with these analyses, it's better to have that coming from our industry. And that's something

16:17 that we really pride ourselves on, is having the energy background. I mean, look, um, Silicon Valley, one, even if they wanted to, they can't build for this industry because they don't understand

16:26 the nuance. They don't have the domain expertise, right? Yeah. They can contribute to the technology, right? Big time. Lots of smart people out there that can contribute to it. I mean, with

16:35 AI that we're building, I mean, we're standing on, you know, the shoulders of big tech. For sure. And so, but I've always had a big belief that whether it's energy or any other, you

16:47 know, technical industry, the solutions are built by people that actually understand the problems from being boots on the ground and have the domain expertise. And even just like cultural alignment

16:58 with companies, you know, that's a huge one. Yeah. You know, just being able to sit across the table and, yeah, they can say nipple up, versus you get some woke motherfucker from Seattle,

17:12 Or, did they just say nipple? I don't even know how to call it that. That is

17:18 what they call it. Every minute of this, you have. Yeah, that's so true. So for you guys, using the app, let's talk about kind of workflows, user experience. Like who's actually using this?

17:35 Obviously, like, your in-office safety managers are. Are

17:40 field hands using it as well? Is that the goal? Just tell me about that. Yeah, yeah, right now, so the reporting, a lot of it is about the company's reporting culture. So if the field is

17:54 reporting near misses, good catches, JSA's, incidents, like all of the things that most good companies are tracking, like all of the customers that we work with, they're tracking that stuff in

18:05 the energy industry, we track that stuff. Whether they're reporting that over the app or not, when we work with them, we tap into that data. So the field is using it, whether they know it or not,

18:16 it's really light touch on the field. That said, there are companies who've gone, you know what, our current reporting system sucks. We've got to switch over to something mobile-friendly. We

18:27 took a lot of time to build our reporting software so that they can get into it on their phone, easy speech to text, like all of the stuff that makes it easy for somebody in the field to use. Yeah,

18:37 big chunky buttons. Yeah. Like someone looks at it and they're like, really? But it's so easy to use. Yeah, it's just, yeah. Intuitive. The point is to be big and simple. Yeah, it's

18:46 funny. I was actually just talking to my team about this. I did this three month course on HBS online and they did this entire case study on Amazon's data science team and all their AB testing on

18:59 their website because like if you look at Amazon's website, I mean like it looks pretty shitty, right? Sure. It's just like, it works. And it works. That thing is so fucking optimized. It's

19:12 kind of insane how much time and effort their data science team has put into that. And so it's not always about looking pretty. It's about what works. Yep, yep, yep. Yeah, that's what we say

19:21 too. And we've really focused on making it easy for the field team to use if they're doing the reporting, but if they're already trained on something, they can keep using that. At the end of the

19:33 day, the AI is gonna analyze what it's given. The modules that we've trained, we ingest their safety manuals, their operations manuals, and their risk matrices, all of the stuff that the

19:46 safety person would look at to understand what's come across the system, and that's what we're gonna use to extract the correlations.
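Colin mentioned building retrieval augmented generation at DW, and grounding a model in company documents like the safety manuals and risk matrices described here is a natural fit for that pattern. The sketch below is one common way to do it, not necessarily how Safety Radar does it: it uses a bag-of-words similarity to stay dependency-free, where a production system would typically use embeddings and a vector store.

```python
import re
from collections import Counter

def tokenize(text):
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def retrieve(report_text, manual_chunks, k=2):
    """Pull the k most relevant safety-manual passages for an incoming report,
    to be prepended to the model prompt so the assessment reflects the
    company's own rules. Bag-of-words similarity keeps the sketch simple."""
    q = tokenize(report_text)
    ranked = sorted(manual_chunks, key=lambda c: cosine(q, tokenize(c)), reverse=True)
    return ranked[:k]

manual_chunks = [  # hypothetical excerpts from a company safety manual
    "Lockout/tagout is required before servicing energized equipment.",
    "Stay out of the line of fire when lifting tubulars to the rig floor.",
    "H2S monitors must be worn at all sour facilities.",
]
report = "Hand injury while servicing an energized pump; equipment was not locked out."
print(retrieve(report, manual_chunks, k=1))
```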

19:58 Whether it's the ops team or the safety team, there are a couple different consumers of that data. The executive team likes to see, depending on the type of tracking that we're showing them, the trending around potential

20:09 severity, especially when you start talking about like equipment hazards and things like that. Like, okay, do we need to replace this particular type of equipment? Is this a company-wide push to

20:18 retrain on something? The executives like to see that stuff. So we do a lot of the dashboard building for them eventually. The end user of the dashboard is typically the safety team, the ops team

20:31 and the executive team, but we're starting to get to a point where we're building outputs like weekly reports for the supervisor or to your point, like when a company, if they have a work order

20:43 system, especially we work with some energy manufacturers, like equipment manufacturers, in a manufacturing operation, you've got a work order system. The AI can go back, we can extract all of

20:54 those past correlations and then as those new jobs are coming into the system, the AI can do the same assessment and go, Oh, this work is about to be done. These are the spots where we've seen

21:05 these types of incidents in the past. These are actually the most dangerous steps in the process. What's your forward looking risk curve look like? Like what of the work to be done in the next week?

21:15 Where are we gonna see our risk go? And that can be fed right back to the field, so they can see: you're about to do the most dangerous job in the company.
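One simple way to picture that forward-looking risk curve: score each scheduled work order against historical incident rates for its activity type and sum by day. The sketch below is illustrative only; the rates, activity names, and schedule are made up.

```python
from collections import defaultdict

# Hypothetical incidents-per-100-jobs rates, derived from past analyzed reports.
HISTORICAL_RATE = {
    "nipple_up_bop": 4.0,
    "lift_tubulars": 6.5,
    "routine_pm": 0.5,
}

def forward_risk_curve(scheduled_work):
    """Aggregate expected incident exposure per day from upcoming work orders.
    `scheduled_work` is a list of (day, activity) tuples pulled from a work order system."""
    curve = defaultdict(float)
    for day, activity in scheduled_work:
        curve[day] += HISTORICAL_RATE.get(activity, 1.0)   # unknown activities get a baseline
    return dict(curve)

week = [
    ("Mon", "routine_pm"), ("Tue", "nipple_up_bop"),
    ("Tue", "lift_tubulars"), ("Fri", "lift_tubulars"),
]
for day, score in forward_risk_curve(week).items():
    print(f"{day}: exposure score {score:.1f}")
```

The day with the biggest spike is where the "you're about to do the most dangerous job" message would get pushed to the crews doing that work.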

21:24 Yeah, just making them aware. You know, the most dangerous job on the oil field is lifting tubulars up to the rig floor. And like every safety meeting, I would make sure I tell every single crew that I

21:37 worked with that, like, just make them aware. Like, guys, this is the most dangerous activity. So stay out of the line of fire, out of the strike zone, and just making people aware and cognizant.

21:48 And so yeah, having essentially a bot that feeds that to you and says, hey, look, you're about to do this activity, it's dangerous, so watch out for this stuff. Yeah, you

21:58 know, especially, you know, in the rig scenario where you have a large workforce there you can communicate that easily. It makes a lot of sense to kind of verbally do that, but also like lone

22:05 worker scenarios. If you're a pumper, you don't really know the location.

22:10 If you're submitting your JSA and you can see, Hey, someone had these types of accidents on this, on

22:16 these types of facilities. It gives you a heads up. Yeah, I think that actually reminds me of this really sad story. This is actually my wife's co-worker's husband, goes out to

22:28 a facility and dies from H2S. And his wife drives out there with their two kids because she hadn't heard from him. She gets out of the car. She dies as well. Thank God the kids, for whatever reason,

22:41 stayed in the car and didn't get out. But I think it's like so many safety incidents come from

22:50 just, like, we get in a groove, right? And we do this every day and you don't think about it. We get comfortable. And so just simply being alerted to, hey, look,

23:06 you have increased risk exposure on this. And it's like, okay, well, that's on top of mind now. So

23:12 I'm gonna be thinking about it. And so, yeah, I think this is kind of going back to how we started talking about safety is that there's a lot of stuff out there that's safety theater that doesn't

23:23 matter. But it's about like, okay, what's the actual real risk that's going on out here and focusing on the inputs instead of trying to address outputs. When I talk about addressing outputs, I

23:36 get so funny, man. Like these wells that I owned up in Tulsa, it's backwards, oil and gas. I could be out there. In my job, when I started roughnecking in 2010, I mean, this is like drilling

23:45 for RSP, legit company. We'd be out there, shorts, no shirt, flip flops, no hard hat, that's how we used to rough neck, right? And I'd own these wells and like, there's no overhead stuff.

23:57 I'd be out there with no hard hat, I don't have any coveralls, and I'd post something, and people would be like, where's your PPE? Like, what the fuck do I need a hard hat for? Like, is a bird gonna

24:05 shit on my head? Like, what's up with this gear? It could be like, hey, I can fucking pinch my finger in this pump jack or something. You know, it's a total waste. Anyways, it's about

24:14 focusing on like what the actual real risk of that operation is. Yeah. And being able to assess historically of what's happened. The complacency issue is interesting too. We had a customer that

24:27 came to us and they're kind of walking us through some of their safety data. And they're like, you know, you always think it's the green hats, that's the new people. Like new people on site,

24:33 high risk. Like we see high risk there and then it troughs and then about a year and a half, two years out, it pops back up because they've become complacent at that point. And they're showing us

24:42 some of their stats, and yeah, you can see that. That's actually really interesting. Yeah, I think green hats are funny because, like, you know, usually if you're

24:50 getting hurt, it's because someone tells you to do something and you don't know any better, so you do it. But also, like, the other side of it is, like, green hats can tend to be super aware

25:01 because you're scared. I'll tell you what, Colin, that was me in

25:10 Iraq too. I remember going on our first patrol and it's like, you've got all of these things that they tell you to look out for at West Point. You train on this stuff, you go to National Training

25:19 Center, you train on it again. It's like, you know, look out for a pile of rocks by the side of the road. They hide IEDs in there. I remember going out of the gate for the first time on our

25:28 patrol. It's my platoon. We're all out there. We're all looking around. We got a bunch of combat vets. It was my first deployment. And like 50 meters outside the gate, I see this pile of rocks

25:37 and I'm like, whoa, whoa, whoa, whoa. And I literally had my sergeant jump on the line and he's like, hey, sir, they just stack those out here.

25:59 Real quick, before we wrap up here, are you

26:04 guys oil and gas specific, are you energy agnostic, are you working across any, because it sounds like you're doing some manufacturing as well? Yeah. So, I mean, the technology, and to me,

26:14 doesn't sound like it's specific to oil and gas. It can be deployed in any energy company. For sure. So, we're energy guys. I mean, most of us in the company, we have a few folks and especially

26:24 some advisors who come from industries like defense and aerospace manufacturing, but we're energy guys. So, we started in this industry. That's our beach head. Many of the models that we've built,

26:36 we've trained off of, you know, we're training them for the customers. We have a lot of energy customers, but we are starting. We just picked up a construction customer. We're talking with some

26:45 pretty notable aerospace companies and scoping out a pilot. That'll be a big one. So we are out there. The frameworks apply, the models are very easy to tweak for

26:59 different industries. But within energy right now, what we say is we are an energy safety company with ambitions. So we're looking outside, and within energy too. Yeah. And I would say

27:10 within that too, it's utilities as well. And I mean, you know, when you think about like the green energy transition, we don't get to achieve that unless we keep our people safe in the process.

27:21 So it's both. I mean, it's renewable energy, hydrogen pipelines, all of this stuff still applies. And at the end of the day, we just train the model for the company's specific use case. Got you.

27:32 Cool. And one thing too to mention is the support we've got from just the oil and gas industry in general. You know, Garrison mentioned we're headquartered in Tulsa. Our first round of non-dilutive

27:42 funding came from an energy startup incubator in Tulsa, Rose Rock Bridge. Yeah. Yeah. Some of our first customers came directly from the funders of that program. That's awesome. That's

27:50 phenomenal. Yeah. Yeah. So it's the industry's just been really supportive of what we're doing. That's awesome. Yeah. Yeah, no, it's all very cool stuff. I think just what AI is enabling in

28:01 historic data sets and augmenting operations is super cool. So this is pretty fascinating to me. If anyone wants to find y'all's company, where do they find y'all at? What's the website?

28:17 www.safetyradar.com. Jealous you guys got the .com. Oh yeah, that was one of the priorities when we were picking our name. We have like 50 that have the .com. Yeah. All right, guys,

28:31 we would highly recommend linking up with this crew. Sounds like

28:35 they're working on some really cool stuff. So safetyradar.com, make sure to share this episode

28:42 with a friend as always. And we will catch you guys on the next episode.
