
The Calculus of IT
An exploration into the intricacies of creating, leading, and surviving IT in a corporation. Every week, Mike and I discuss new ways of thinking about the problems that impact IT leaders. We also explore today's technological advances, all in a fun, easy-listening format, while having a few cocktails with friends. Stay current on all Calculus of IT happenings by visiting our website: www.thecoit.us. To watch the podcast recordings, visit our YouTube page at https://www.youtube.com/@thecalculusofit.
Calculus of IT - Season 2 Episode 9 - Part 2 - Emerging Technologies and Their Impact on Autonomy
It's 2027.
You have just hired your new VP of Subdermal Biomodifications, and she has spent the last four years constructing a self-designed, explicit LLM environment, complete with a custom agent and rules structure that she has been evolving with tremendous success. She can't wait to bring it into your company so she, and her agentic twin, can continue her work.
I'll pause there.
Questions:
1) How would you deal with the inbound IP conflict?
2) How would this conflict with your internal governance for AI development, LLM ownership, and vendor platform lock-in?
3) How would you handle attaching her environment to yours? What would that tether look like, and how would you secure it?
4) What would you do when she left to go to another company?
These are just some of the questions Mike, Kevin, and I tried to tackle last night on the podcast, on the topic of the loss of autonomy in IT leadership. Looking into the future, even the near-term future, a few things (of many) become clear:
1) Any vendor, either platform or consultancy, trying to sell you an AI vision is going to miss the mark. This is because there is no mark. Save your money, focus inwards on addressing risk and compatibility and search for the answers.
2) There is no roadmap for delineation of "personal" or "portable" xAI vs corporate xAI and it would be nearly impossible to create one with the dynamism in AI development.
3) We may eventually need an AIL (AI Language) that allows for compatibility across xAI engines/xLMs.
4) Sure as shit, you are going to need internal portability governance, not only for your vendors but for your staff as well.
Listen in to Episode 9, Part 2 of the Calculus of IT podcast, where we discussed this stuff. We also took a stab at quantum computing, the increasing demands on risk management, the future of edge computing, and the vendor relationship evolution, each with its own separate impact on autonomy in IT leadership. Next week, we finish up Episode 9 with Part 3, and we will be spending some quality time on the future of search, data sovereignty, and the PEOPLE in your organization.
The Calculus of IT website - https://www.thecoit.us
"The New IT Leader's Survival Guide" Book - https://www.longwalk.consulting/library
"The Calculus of IT" Book - https://www.longwalk.consulting/library
The COIT Merchandise Store - https://thecoit.myspreadshop.com
Donate to Wikimedia - https://donate.wikimedia.org/wiki/Ways_to_Give
Buy us a Beer!! - https://www.buymeacoffee.com/thecalculusofit
YouTube - @thecalculusofit
Slack - Invite Link
Email - nate@thecoit.us
Email - mike@thecoit.us
WEBVTT
1
00:00:01.970 --> 00:00:02.890
Nathan McBride: Promise
2
00:01:14.240 --> 00:01:20.139
Nathan McBride: you're listening to the Essential Mix with Paul Oakenfold, coming live from Cream in Liverpool
3
00:06:19.570 --> 00:06:20.390
Nathan McBride: script.
4
00:07:14.600 --> 00:07:15.310
Nathan McBride: Oh, shit.
5
00:07:46.910 --> 00:07:47.940
Nathan McBride: sin, Shirley.
6
00:09:44.520 --> 00:09:45.200
Nathan McBride: Let's see.
7
00:11:04.410 --> 00:11:05.090
Nathan McBride: I think
8
00:11:08.730 --> 00:11:09.420
Nathan McBride: that's great
9
00:11:22.700 --> 00:11:23.390
Nathan McBride: with you.
10
00:12:02.760 --> 00:12:04.650
Nathan McBride: 1 1 1
11
00:15:10.280 --> 00:15:16.009
Nathan McBride: you're listening to the Essential Mix with Paul Oakenfold, coming live from Cream in Liverpool.
12
00:15:22.880 --> 00:15:26.570
Nathan McBride: Yeah, especially, it is essential.
13
00:15:55.490 --> 00:15:57.220
Mike Crispin: Yo yo yo!
14
00:16:04.380 --> 00:16:05.080
Mike Crispin: What's up, man?
15
00:16:28.710 --> 00:16:29.860
Nathan McBride: Thank you for you
16
00:16:43.420 --> 00:16:49.650
Nathan McBride: that, my young friend, was the one and only Paul Oakenfold.
17
00:16:49.930 --> 00:16:53.119
Mike Crispin: Shocker. I knew you'd be listening to him for some reason.
18
00:16:53.390 --> 00:17:00.790
Nathan McBride: Well, I have Cream in Liverpool, November 30th, 1997.
19
00:17:02.330 --> 00:17:04.410
Nathan McBride: Fucking epic mix.
20
00:17:05.180 --> 00:17:07.310
Mike Crispin: Awesome. Yeah, I I have been
21
00:17:08.190 --> 00:17:12.159
Mike Crispin: SoundClouding and listening to your list on YouTube.
22
00:17:12.640 --> 00:17:18.639
Mike Crispin: Just so much great stuff, endless, endless stuff to listen to, and we'll never run out of new things.
23
00:17:19.210 --> 00:17:20.259
Mike Crispin: It's pretty awesome.
24
00:17:20.970 --> 00:17:22.069
Mike Crispin: That's awesome.
25
00:17:22.470 --> 00:17:29.930
Nathan McBride: There's there's literally no way to listen to every great thing that happened between '97 and 2001
26
00:17:30.230 --> 00:17:31.420
Nathan McBride: in our lifetime.
27
00:17:32.610 --> 00:17:36.289
Mike Crispin: I would agree. There really isn't, there's just so many great...
28
00:17:36.490 --> 00:17:39.479
Mike Crispin: That's what makes it fun is you can listen to something new every day, and.
29
00:17:39.850 --> 00:17:42.720
Mike Crispin: you know, get plugged in. It's awesome.
30
00:17:44.020 --> 00:17:45.069
Mike Crispin: Can't beat it.
31
00:17:46.430 --> 00:17:50.180
Nathan McBride: Nope, all right. Well, my friend.
32
00:17:51.520 --> 00:17:55.760
Nathan McBride: or, as Phish put it: my friend, my friend, he's got a gun.
33
00:17:57.550 --> 00:18:00.029
Mike Crispin: Is our friend Kevin joining us tonight?
34
00:18:00.730 --> 00:18:04.309
Nathan McBride: Brent Kevin is coming back from the great North.
35
00:18:04.590 --> 00:18:05.240
Mike Crispin: Oh!
36
00:18:07.600 --> 00:18:10.880
Nathan McBride: Went up to see his son's baseball game.
37
00:18:11.340 --> 00:18:12.140
Mike Crispin: Oh, nice!
38
00:18:12.630 --> 00:18:20.510
Nathan McBride: Urge, so we might be picking him up a little bit later on the old telephone program.
39
00:18:21.200 --> 00:18:24.880
Mike Crispin: Wow! Going going old school. I love it.
40
00:18:25.320 --> 00:18:28.209
Nathan McBride: Gonna call into the Calculus of IT hotline.
41
00:18:28.210 --> 00:18:30.190
Mike Crispin: Yeah, the the 1 900 number.
42
00:18:30.540 --> 00:18:33.109
Nathan McBride: You know? 1 900 number. Yeah. Is it a call in
43
00:18:35.370 --> 00:18:36.100
Nathan McBride: Any answers.
44
00:18:36.100 --> 00:18:38.299
Mike Crispin: 100 COIT US.
45
00:18:38.660 --> 00:18:42.809
Nathan McBride: And answer our trick question of the day, which is.
46
00:18:43.120 --> 00:18:47.210
Nathan McBride: which is the best generative AI model to use right now?
47
00:18:49.090 --> 00:18:51.399
Mike Crispin: It'll be different tomorrow at 9 Am.
48
00:18:51.530 --> 00:18:53.189
Mike Crispin: Or whenever you listen to this.
49
00:18:53.430 --> 00:18:59.480
Nathan McBride: Exactly well and different itself is a relative interpretation, so.
50
00:18:59.910 --> 00:19:00.930
Mike Crispin: That's true.
51
00:19:01.500 --> 00:19:02.640
Nathan McBride: It really doesn't matter.
52
00:19:03.420 --> 00:19:05.199
Mike Crispin: Just pick one and have fun.
53
00:19:05.780 --> 00:19:11.650
Nathan McBride: Just pick one that's got a G, a P, a T, and an A, and an I, and an O,
54
00:19:12.920 --> 00:19:18.430
Nathan McBride: 3 or 4, or an R or G in it.
55
00:19:18.830 --> 00:19:22.510
Nathan McBride: I might have already said you and you're good to go.
56
00:19:23.240 --> 00:19:27.409
Nathan McBride: and the answers will always be 100% true. And basically, there's nothing else to worry about.
57
00:19:28.610 --> 00:19:29.380
Mike Crispin: True that.
58
00:19:30.250 --> 00:19:33.620
Nathan McBride: True dat to Michael.
59
00:19:36.250 --> 00:19:37.199
Nathan McBride: How are you doing.
60
00:19:37.940 --> 00:19:42.177
Mike Crispin: I'm fantastic, doing great. I'm trying to light a new candle here, and it's been
61
00:19:42.860 --> 00:19:50.090
Mike Crispin: the challenge trying to get the wrapping out of here, so I don't like this flammable covering on fire.
62
00:19:52.240 --> 00:19:53.389
Mike Crispin: That would be bad.
63
00:19:54.280 --> 00:19:57.890
Nathan McBride: But it would be faster if you did actually light
64
00:19:59.370 --> 00:20:02.640
Nathan McBride: the flammable covering on fire. So you're out.
65
00:20:03.520 --> 00:20:06.999
Mike Crispin: I think the effing things melted into the candle. That's why.
66
00:20:07.370 --> 00:20:11.310
Mike Crispin: Hold on, I know what I can do. You can do this. I've got a little scraper.
67
00:20:11.450 --> 00:20:14.470
Mike Crispin: Let's get the scraper in. There, there we go!
68
00:20:15.270 --> 00:20:18.819
Mike Crispin: Oh, that smells like it's smells like a bathroom.
69
00:20:19.200 --> 00:20:21.760
Mike Crispin: Sounds like a bathroom candle. Let's see what this one is.
70
00:20:22.190 --> 00:20:24.329
Nathan McBride: It's like the bathrooms, Chris.
71
00:20:26.170 --> 00:20:28.169
Mike Crispin: This one smells like a nightclub
72
00:20:28.900 --> 00:20:36.900
Mike Crispin: like well, like a like a nightclub before anyone's in there, you know, they have the incense and all of their shit going, you know, or back when we were when we used to go
73
00:20:37.150 --> 00:20:42.329
Mike Crispin: one-third cologne, one-third piss, and one-third other stuff.
74
00:20:42.620 --> 00:20:43.190
Nathan McBride: Yeah.
75
00:20:44.270 --> 00:20:45.480
Mike Crispin: Farts and beer.
76
00:20:46.770 --> 00:20:50.360
Nathan McBride: Why do we say, when it comes to clubs, "when we used to go"?
77
00:20:50.570 --> 00:20:54.790
Nathan McBride: Why isn't it ever, like, "when we just went"?
78
00:20:55.830 --> 00:21:00.380
Mike Crispin: Well, cause we don't live in New York City. And we're doing Vegas.
79
00:21:02.160 --> 00:21:07.750
Mike Crispin: Actually, I'm I'm not. It's not fair to say that. I mean, I just don't know. Do you know of any places in Boston that
80
00:21:07.890 --> 00:21:13.100
Mike Crispin: play the music we like like after hours and stuff? I'm just so unplugged. I don't know anything
81
00:21:13.950 --> 00:21:18.010
Mike Crispin: dance music, but they don't. You know. Djs open until midnight by midnight.
82
00:21:18.170 --> 00:21:23.050
Nathan McBride: Apparently already getting up for the next morning, so.
83
00:21:23.050 --> 00:21:26.860
Mike Crispin: I mean, like there's do. They still do raves and all that shit? I don't know.
84
00:21:28.920 --> 00:21:31.159
Nathan McBride: That would be cool. Here's what we should do.
85
00:21:31.730 --> 00:21:37.340
Nathan McBride: In addition to our music, podcast we should start early raves whatever.
86
00:21:37.340 --> 00:21:43.580
Nathan McBride: What if it did, the rave, and started at like 4 o'clock in the afternoon, and went till midnight?
87
00:21:43.730 --> 00:21:45.070
Nathan McBride: It's still 8 hours long.
88
00:21:45.570 --> 00:21:48.140
Mike Crispin: I think we could manage that. We have plenty of time.
89
00:21:48.140 --> 00:21:57.929
Nathan McBride: We tell the kids. Listen. You don't need to start at midnight. You can start at a reasonable time in the afternoon, and then go till midnight, and then get some good sleep.
90
00:21:59.830 --> 00:22:01.340
Nathan McBride: Practice, good sleeping habits.
91
00:22:03.560 --> 00:22:08.480
Mike Crispin: We have. We we have. We have plenty of time, so we could do a number.
92
00:22:08.480 --> 00:22:08.810
Nathan McBride: Have.
93
00:22:08.810 --> 00:22:09.490
Mike Crispin: But I'm more.
94
00:22:09.490 --> 00:22:18.289
Nathan McBride: Different different drugs dispersed throughout the night like 4 o'clock. You get all of your like amphetamines, Mescaline. You get all the good drugs.
95
00:22:18.390 --> 00:22:21.210
Nathan McBride: Then, like around 10 PM,
96
00:22:21.660 --> 00:22:29.180
Nathan McBride: You get some, like, laced come-downers. And then by 11:30 PM all the drugs are literally all
97
00:22:29.750 --> 00:22:31.280
Nathan McBride: like, go to bed drugs.
98
00:22:32.140 --> 00:22:32.780
Mike Crispin: Yeah.
99
00:22:33.700 --> 00:22:34.699
Nathan McBride: I'm telling you.
100
00:22:35.000 --> 00:22:36.269
Nathan McBride: There's a market for this.
101
00:22:37.420 --> 00:22:39.996
Mike Crispin: There is, and that's the amazing thing is
102
00:22:43.760 --> 00:22:45.630
Mike Crispin: there's like, I think, of
103
00:22:45.890 --> 00:22:52.110
Mike Crispin: all of the curation and this like we talked about last week a little bit like, I think this is an area where in music
104
00:22:52.873 --> 00:22:57.000
Mike Crispin: there's gonna be even. There's gonna be like reemergence of dance music.
105
00:22:57.000 --> 00:22:59.180
Nathan McBride: I just. I just actually had an idea.
106
00:22:59.890 --> 00:23:07.060
Nathan McBride: Fyre Festival, but without all of the lying and the stealing.
107
00:23:07.696 --> 00:23:10.109
Nathan McBride: We do Fyre Festival for old people.
108
00:23:12.070 --> 00:23:19.310
Nathan McBride: No, hear me out, so you show up at 3 o'clock, and yacht rock will be playing, like 3 to 4, a yacht rock intro.
109
00:23:19.620 --> 00:23:20.200
Mike Crispin: Yep.
110
00:23:20.520 --> 00:23:25.729
Nathan McBride: Then we ease them into some banging, like, late-nineties, early-2000s trance.
111
00:23:27.151 --> 00:23:30.119
Nathan McBride: Make sure there's like a nice salad buffet.
112
00:23:30.610 --> 00:23:31.250
Mike Crispin: Sure.
113
00:23:31.520 --> 00:23:33.880
Nathan McBride: It's over on this side, rolling.
114
00:23:34.170 --> 00:23:37.909
Nathan McBride: Feed them copious amounts of just, basically, sedatives,
115
00:23:38.580 --> 00:23:46.100
Nathan McBride: nothing even remotely upper just sedatives, but enough to keep them going, but also then fuel them with alcohol.
116
00:23:47.360 --> 00:23:51.810
Nathan McBride: But the best part is right. Around 10 o'clock at night we start playing the podcast
117
00:23:52.210 --> 00:23:55.820
Nathan McBride: like, I pick a random episode of the podcast we start playing it.
118
00:23:57.170 --> 00:24:00.570
Nathan McBride: Are you over top of the over top of the trance, dubbing.
119
00:24:00.790 --> 00:24:01.430
Mike Crispin: Yep.
120
00:24:02.580 --> 00:24:07.199
Nathan McBride: Imagine ours, like, pick any episode and put it over a sick trance beat.
121
00:24:08.960 --> 00:24:10.940
Mike Crispin: That'd be awesome.
122
00:24:11.740 --> 00:24:13.159
Nathan McBride: All right. I'm gonna put it together.
123
00:24:13.920 --> 00:24:17.739
Nathan McBride: Stand by. But we could sell Al Co.
124
00:24:18.000 --> 00:24:21.180
Nathan McBride: All there the alcohol.
125
00:24:23.700 --> 00:24:24.649
Nathan McBride: What are you doing.
126
00:24:25.630 --> 00:24:26.290
Mike Crispin: What's that?
127
00:24:26.620 --> 00:24:27.409
Nathan McBride: How are you doing.
128
00:24:27.960 --> 00:24:29.250
Mike Crispin: Staring at the screen.
129
00:24:29.250 --> 00:24:30.840
Nathan McBride: Keep my order right now.
130
00:24:30.990 --> 00:24:31.960
Mike Crispin: What's that?
131
00:24:32.330 --> 00:24:33.069
Nathan McBride: Right now.
132
00:24:33.530 --> 00:24:39.150
Mike Crispin: No, I'm I'm I'm trying to just get I I just making sure I had the drive the script open
133
00:24:41.750 --> 00:24:46.230
Mike Crispin: I just got I was just ran down here. I'm sorry I'm I'm I'm plugged in, ready to rock.
134
00:24:46.630 --> 00:24:47.810
Nathan McBride: Okay. Okay.
135
00:24:48.110 --> 00:24:50.050
Mike Crispin: Yeah, I'm good. I'm good. I'm good. I'm ready to run.
136
00:24:50.150 --> 00:24:52.840
Nathan McBride: Kingdom State. By the way, I left Valhalla.
137
00:24:53.310 --> 00:24:54.190
Mike Crispin: What's that?
138
00:24:54.410 --> 00:24:57.110
Nathan McBride: I left Valhalla, the kingdom in Orna.
139
00:24:57.110 --> 00:25:03.580
Mike Crispin: Oh, so so that's the thing. I don't know how to get connected with you. How do I get like in the same kingdom as you do. I have to be near you.
140
00:25:03.980 --> 00:25:06.389
Nathan McBride: Friend me first. It's Pouncey Silver Kitten.
141
00:25:06.540 --> 00:25:08.400
Mike Crispin: I've I already. I've already connected with you.
142
00:25:08.400 --> 00:25:13.580
Nathan McBride: Yeah, we're connected. Now you have to join my kingdom, and I'm in the I'm in Pantheon of the Golden Sun.
143
00:25:14.200 --> 00:25:15.580
Mike Crispin: So how do I do that.
144
00:25:16.857 --> 00:25:23.449
Nathan McBride: Well, actually, this kingdoms, it might be full at the moment, but next time it's open I'll let you know, and then you would apply to be in there.
145
00:25:24.970 --> 00:25:27.269
Nathan McBride: You gotta get over 100. You gotta.
146
00:25:27.270 --> 00:25:29.842
Mike Crispin: I'm in level. I'm in level 75 right now.
147
00:25:30.450 --> 00:25:33.270
Nathan McBride: Gotta work some nights and weekends for a couple of weeks.
148
00:25:33.990 --> 00:25:37.700
Mike Crispin: Yeah, I'm not. I'm not there yet. I definitely have some
149
00:25:38.500 --> 00:25:42.279
Mike Crispin: catching up to do. Let me see where I am. Yeah, I'm level 75.
150
00:25:42.800 --> 00:25:46.599
Nathan McBride: Okay, 25 more levels and you can come into my kingdom, the one I'm in now.
151
00:25:47.930 --> 00:25:48.810
Mike Crispin: Jesus.
152
00:25:51.070 --> 00:26:02.099
Mike Crispin: I'm playing Orna now. So now now I'm telling you that's what I'm doing all good, all good.
153
00:26:02.990 --> 00:26:10.290
Nathan McBride: Well, welcome back, everybody, to the Calculus podcast, and welcome to Episode 9, Part 2 of Season 2.
154
00:26:11.240 --> 00:26:14.450
Nathan McBride: Last week we got supremely AI AF.
155
00:26:14.970 --> 00:26:16.079
Mike Crispin: Yes, we did.
156
00:26:16.080 --> 00:26:16.760
Nathan McBride: So.
157
00:26:16.760 --> 00:26:17.360
Mike Crispin: Awesome.
158
00:26:17.630 --> 00:26:20.500
Nathan McBride: I know so much. A and F
159
00:26:21.253 --> 00:26:22.819
Nathan McBride: I'm still a little hungover.
160
00:26:24.110 --> 00:26:27.420
Mike Crispin: That's true. There was a lot of AI as well.
161
00:26:28.360 --> 00:26:28.930
Nathan McBride: Yep.
162
00:26:29.670 --> 00:26:30.529
Mike Crispin: For sure.
163
00:26:30.920 --> 00:26:32.120
Nathan McBride: An af perspective
164
00:26:32.520 --> 00:26:33.140
Mike Crispin: Yeah.
165
00:26:33.320 --> 00:26:45.430
Nathan McBride: Mr. Dushney is on the road, so he may be able to join us tonight at some point in the episode. We hope he can get in here; otherwise it's myself and, once again, the inevitable Michael J. Crispin.
166
00:26:45.740 --> 00:26:47.710
Mike Crispin: Michael J. Crispin is right.
167
00:26:47.710 --> 00:26:52.790
Nathan McBride: Jose Crispin. Hello again. Everybody. If you missed last week's episode.
168
00:26:54.000 --> 00:26:57.329
Nathan McBride: And this is, I'm taking a different approach this week. Okay.
169
00:26:57.740 --> 00:27:01.339
Nathan McBride: if you missed last week's episode. Please go listen to it.
170
00:27:02.081 --> 00:27:03.890
Mike Crispin: Good idea. I like that.
171
00:27:03.890 --> 00:27:05.180
Nathan McBride: You try all right.
172
00:27:05.310 --> 00:27:08.610
Nathan McBride: Instead of saying what the fuck is wrong with you, I'm trying a new approach.
173
00:27:09.590 --> 00:27:11.189
Mike Crispin: Go listen to it, please.
174
00:27:11.570 --> 00:27:15.560
Nathan McBride: Please go listen to it. Okay, so that's all. I'm not gonna say anything else.
175
00:27:16.030 --> 00:27:23.589
Nathan McBride: But I will say that Part 2 of Episode 9 won't make any sense if you don't listen to it.
176
00:27:23.730 --> 00:27:28.320
Nathan McBride: So put the fucking shit down that you're doing, and go listen to it, Oops. I screwed up.
177
00:27:30.810 --> 00:27:32.440
Nathan McBride: Please go listen to Part 1 first,
178
00:27:33.860 --> 00:27:36.469
Nathan McBride: and then come and listen to part 2. It'll make more sense.
179
00:27:38.750 --> 00:27:44.920
Nathan McBride: So last week, in Part 1, where we had Mr. Dushney here, Mike, Kevin, and I dove into
180
00:27:46.081 --> 00:27:48.090
Nathan McBride: I mean, really, the episode
181
00:27:48.250 --> 00:27:55.169
Nathan McBride: episode 9 itself is about emerging technologies and their impact on autonomy in IT, I should say,
182
00:27:55.820 --> 00:28:03.620
Nathan McBride: so if you missed episode one, here's the TL;DR version, which is, we explored the autonomy paradox of AI integration,
183
00:28:04.580 --> 00:28:11.580
Nathan McBride: which is how AI can simultaneously grant IT leaders more control while quietly creating new dependencies.
184
00:28:11.800 --> 00:28:20.350
Nathan McBride: talked about the need for frameworks to evaluate AI's impact on decision-making power, skill development, and data control.
185
00:28:20.900 --> 00:28:29.610
Nathan McBride: We talked about risk management transformation as AI introduces challenges like algorithmic bias, explanation gaps.
186
00:28:29.970 --> 00:28:34.710
Nathan McBride: and we also discussed how to avoid the slippery slope of vendor AI model lock in
187
00:28:34.970 --> 00:28:37.370
Nathan McBride: while still leveraging cutting edge tools.
188
00:28:37.590 --> 00:28:39.450
Nathan McBride: This week
189
00:28:39.650 --> 00:28:47.779
Nathan McBride: we have a few leftover AI crumbs, we were hitting the two-hour mark last week, so we put a pause on a couple of things. But we're gonna
190
00:28:48.291 --> 00:29:00.060
Nathan McBride: get our heads around the last bits of AI discussion this week. We're gonna dive into Mike's favorite, quantum computing, what it means for encryption and the future of IT security. We're gonna talk a bit about edge computing,
191
00:29:01.090 --> 00:29:08.049
Nathan McBride: which always cracks me up because edge computing is all computing. So.
192
00:29:08.200 --> 00:29:20.284
Nathan McBride: But there's a term, and it's got a thing, and it means something, people. So we're talking about edge computing and how decentralizing it reshapes autonomy and architecture around it. And then we're gonna talk about
193
00:29:21.590 --> 00:29:33.099
Nathan McBride: what is basically the elephant in the room for most companies these days: building and maintaining expertise in an era of accelerating technology cycles. So no one wants to talk about it.
194
00:29:33.300 --> 00:29:36.029
Nathan McBride: Everyone's looking around the room saying, who's replaceable.
195
00:29:36.300 --> 00:29:38.530
Nathan McBride: We're gonna try and get into this
196
00:29:39.730 --> 00:29:42.699
Nathan McBride: pretty pretty pretty pretty good stuff.
197
00:29:44.800 --> 00:29:47.807
Mike Crispin: Yep, yeah, I think the quantum piece is.
198
00:29:48.630 --> 00:29:51.420
Mike Crispin: there's so many unknowns about that. But we'll talk about that.
199
00:29:53.160 --> 00:30:01.610
Nathan McBride: Yup, and I'm remiss because I meant to get the jobs update done today before the episode.
200
00:30:01.860 --> 00:30:03.759
Nathan McBride: But I was busy.
201
00:30:05.250 --> 00:30:06.110
Mike Crispin: Oh!
202
00:30:06.110 --> 00:30:07.340
Nathan McBride: I promise
203
00:30:07.860 --> 00:30:13.900
Nathan McBride: to the Calculus of IT nation at large that next week we will have the most epic jobs update
204
00:30:14.230 --> 00:30:18.654
Nathan McBride: for you. But I want to check and see if Manscaped is still
205
00:30:21.530 --> 00:30:27.330
Mike Crispin: I bet I bet that's that one has been. I wanna look as well. I bet that one's been.
206
00:30:28.680 --> 00:30:30.270
Nathan McBride: Well, let's find out.
207
00:30:42.780 --> 00:30:46.379
Nathan McBride: Yes, the VP of IT at Manscaped has been snatched up.
208
00:30:46.730 --> 00:30:50.669
Nathan McBride: They're now looking for a senior back-end engineer.
209
00:30:51.870 --> 00:30:53.790
Nathan McBride: Excuse me, your software engineer.
210
00:30:54.447 --> 00:30:58.950
Mike Crispin: Sorry, but I just sorry. I thought you were kidding.
211
00:30:58.950 --> 00:31:01.250
Nathan McBride: Of enterprise apps.
212
00:31:02.120 --> 00:31:03.640
Mike Crispin: Holy crap!
213
00:31:04.130 --> 00:31:04.800
Nathan McBride: Yep.
214
00:31:05.130 --> 00:31:07.859
Mike Crispin: Oh, that! But those 2 were open before right.
215
00:31:08.090 --> 00:31:08.820
Nathan McBride: Yes.
216
00:31:08.820 --> 00:31:10.190
Mike Crispin: Yeah, yeah, okay.
217
00:31:12.500 --> 00:31:13.470
Mike Crispin: Gotcha.
218
00:31:13.910 --> 00:31:17.499
Nathan McBride: So sorry, the VP of IT role at Manscaped is gone.
219
00:31:19.490 --> 00:31:20.860
Nathan McBride: We have a Slack board.
220
00:31:21.000 --> 00:31:22.880
Nathan McBride: Does anybody even know what Slack is?
221
00:31:23.710 --> 00:31:25.210
Mike Crispin: No. Can you tell us about it?
222
00:31:25.520 --> 00:31:31.640
Nathan McBride: It's like what would happen if we took Microsoft Teams and then completely ripped out all the code and replaced it with the code from Slack.
223
00:31:32.470 --> 00:31:38.059
Nathan McBride: so go to our Slack board, hang out, talk
224
00:31:38.365 --> 00:31:38.670
Mike Crispin: Top.
225
00:31:38.670 --> 00:31:47.419
Nathan McBride: about Teams, or talk about how much Teams has changed your life positively, whatever, we don't care. Just come on the Slack board, we're not going to use Teams, and hang out.
226
00:31:47.720 --> 00:31:52.010
Nathan McBride: We'd use Discord, but we tried that, and no one came to Discord because it's called Discord.
227
00:31:52.150 --> 00:31:57.530
Mike Crispin: They're afraid that people might see worse. What are the games they play? And you know.
228
00:31:57.530 --> 00:32:04.540
Nathan McBride: People are like, oh, I don't play Fortnite. Dude, it's right there on your screen. You're playing it right now as we're talking.
229
00:32:05.560 --> 00:32:06.370
Nathan McBride: What are you talking about?
230
00:32:06.370 --> 00:32:21.370
Mike Crispin: Too bad, because with Discord there was so much potential, though, with what we could have done with that that we can't do in Slack. But I guess we're still too serious of a podcast, too professionally aligned, to have a more playful...
231
00:32:21.370 --> 00:32:23.270
Nathan McBride: But Discord, it'd just be you and me.
232
00:32:23.470 --> 00:32:27.350
Mike Crispin: That's what I mean, like, it was kind of like, oh, this is a little bit different. It'll be
233
00:32:27.450 --> 00:32:33.980
Mike Crispin: more fun and games, and everyone wanted business and real names and everything. It's like, all right. Fine.
234
00:32:33.980 --> 00:32:42.350
Nathan McBride: The level of people that are, like, Teams. I can do Slack, it's a reach, but Discord, Mastodon, Bluesky, can't do it.
235
00:32:42.660 --> 00:32:43.620
Mike Crispin: Yeah, that would be.
236
00:32:43.620 --> 00:32:45.269
Nathan McBride: Complicated, to understand.
237
00:32:45.270 --> 00:32:49.940
Mike Crispin: Too unprofessional. Not if someone saw me on here they'd be. What are they? What am I doing on discord?
238
00:32:50.380 --> 00:32:50.940
Nathan McBride: So.
239
00:32:51.150 --> 00:32:55.590
Mike Crispin: I can't. Well, it's probably blocked, too. They probably block it at work. So yeah.
240
00:32:55.590 --> 00:32:56.881
Mike Crispin: they get on it. But.
241
00:32:58.230 --> 00:33:04.210
Nathan McBride: If you like our show, or if your mother likes our show, or like your neighbors.
242
00:33:04.730 --> 00:33:11.609
Nathan McBride: just by proxy, you can give us 5 stars, that's all there is to it, I don't know, on whatever platform you listen to. Buy us beers.
243
00:33:12.080 --> 00:33:18.010
Nathan McBride: I'm literally drinking towards the bottom of my Mount Gay rum. I'm like at the
244
00:33:18.280 --> 00:33:21.880
Nathan McBride: I'm really scraping the cabinets for booze, so we need some cash.
245
00:33:23.510 --> 00:33:24.530
Nathan McBride: Alright. Where were we?
246
00:33:26.530 --> 00:33:27.520
Nathan McBride: AI stuff?
247
00:33:27.750 --> 00:33:29.649
Nathan McBride: Yes, back to AI and quantum.
248
00:33:30.480 --> 00:33:34.519
Nathan McBride: Well, AI first and quantum, Mike, we don't want to conflate the two.
249
00:33:34.520 --> 00:33:38.050
Mike Crispin: Oh, okay, yeah, we wanna we wanna we wanna talk about AI first, absolutely.
250
00:33:38.354 --> 00:33:56.949
Nathan McBride: One existential crisis before we get to the next one. So let's just kind of break them up into two different buckets. So we left off last week talking about all kinds of amazing AI stuff with regards to autonomy. This week I wanted to close that loop, a couple of small items. First, the speed versus control trade-off.
251
00:33:58.490 --> 00:34:01.370
Nathan McBride: Sure, I think.
252
00:34:02.120 --> 00:34:05.760
Nathan McBride: And what we're seeing is the is this
253
00:34:06.000 --> 00:34:11.549
Nathan McBride: the insanity around the speed at which either AI decisions are made because of AI
254
00:34:13.590 --> 00:34:18.099
Nathan McBride: or decisions are being made because people think they're the right decisions based on AI.
255
00:34:18.980 --> 00:34:19.900
Nathan McBride: But.
256
00:34:19.900 --> 00:34:20.449
Mike Crispin: Sure.
257
00:34:20.630 --> 00:34:23.740
Nathan McBride: All of a sudden this, this whole AI effect.
258
00:34:23.940 --> 00:34:42.830
Nathan McBride: has given people the idea where, well, before I would have, like, done some time doing a functional requirements doc, talking to vendors on the phones. And now I can just get 100 recommendations in 2 seconds. This is great, and then tell it to pick one for me. Oh, pick this one? Great, and I'll use that one.
259
00:34:43.350 --> 00:34:50.330
Nathan McBride: But where the problem is, this creates an enormous competitive advantage problem.
260
00:34:51.380 --> 00:34:52.130
Nathan McBride: So
261
00:34:52.940 --> 00:34:58.719
Nathan McBride: using AI for fraud detection and response, there's lots of platforms now, of course, everyone has to say it,
262
00:34:59.210 --> 00:35:02.469
Nathan McBride: that they include generative AI that
263
00:35:03.540 --> 00:35:15.450
Nathan McBride: could identify unusual patterns and make decisions about transactional legitimacy and implement protective measures in milliseconds. This is great. It gives them an edge over the competitors.
264
00:35:16.020 --> 00:35:23.860
Nathan McBride: But it comes with a trade-off, which is that I go, and I'm like, okay, there's 10 vendors, according to AI, that are all amazing.
265
00:35:23.970 --> 00:35:34.330
Nathan McBride: And they all have AI. So I'm going to implement this one over here because it has the promise that it makes the maximum reduction in human intervention for me.
266
00:35:35.245 --> 00:35:38.620
Nathan McBride: But what it leads to is reduced human oversight.
267
00:35:39.260 --> 00:35:42.709
Nathan McBride: So yeah, you're trading up all this wonderful speed.
268
00:35:45.140 --> 00:35:49.639
Nathan McBride: But there's no practical way for humans to review every decision before it's implemented.
269
00:35:50.930 --> 00:35:59.609
Nathan McBride: There's literally no way to do this, and it creates a new form of dependency, which is that reliance on AI to make good decisions for you without your supervision.
270
00:35:59.830 --> 00:36:06.690
Nathan McBride: Now, sure, before you respond, I will respond to this, which is to say, well.
271
00:36:07.180 --> 00:36:14.020
Nathan McBride: no, no, that that's not me. See, what we do is we use AI to make decisions, and then we review those decisions. Then we implement them.
272
00:36:14.210 --> 00:36:18.950
Nathan McBride: So then why are you using it? So I'll pause right there.
273
00:36:19.767 --> 00:36:26.944
Mike Crispin: I think it's I think it's good to use AI as a guide. If you're especially if you're not an expert at something, it's good to have some
274
00:36:27.810 --> 00:36:32.419
Mike Crispin: guidance and some ideas of who to reach out to, and who to talk to
275
00:36:32.600 --> 00:36:38.429
Mike Crispin: and to nuance your query, to have it fit your particular needs and your requirements.
276
00:36:38.780 --> 00:36:43.219
Mike Crispin: It can give you a different answer, or a relatively different answer, every time you ask.
277
00:36:43.340 --> 00:36:49.960
Mike Crispin: So if you've you need to have like we've said before is you need to have a a good
278
00:36:50.200 --> 00:36:53.090
Mike Crispin: prompt, or a good query, or a loaded question
279
00:36:53.820 --> 00:36:59.680
Mike Crispin: to, you know, to basically be able to get the most out of AI. And you, you know, you're not gonna just...
280
00:37:00.350 --> 00:37:13.870
Mike Crispin: I think a lot of decision-makers aren't gonna just say, well, AI told me, and that's what I'm gonna do. I think they're gonna use it as a good way to get maybe something started, to get a few ideas brewing, the same way that people will call up their Gartner analyst and ask,
281
00:37:14.370 --> 00:37:17.930
Mike Crispin: Oh, who should I talk to in this space, you know.
282
00:37:18.470 --> 00:37:25.899
Mike Crispin: or sometimes they Google or YouTube it to get ideas. I think that's a good way to start. I think the difference here is that you can follow up
283
00:37:27.154 --> 00:37:35.520
Mike Crispin: with the AI with more requirements and get a more nuanced answer over time. But, like you said, you gotta do those post reviews.
284
00:37:35.980 --> 00:37:47.859
Mike Crispin: and it's always good to get an idea upfront to either. Affirm your kind of your initial thoughts, or to give you a good guide to start in an area you may have less experience in.
285
00:37:49.490 --> 00:37:52.220
Nathan McBride: Those are all perfect points.
286
00:37:53.420 --> 00:37:57.750
Nathan McBride: A few weeks back I posted a thing on LinkedIn about the AI velocity wall.
287
00:37:57.990 --> 00:37:59.010
Mike Crispin: Okay. Yeah.
288
00:38:00.260 --> 00:38:07.970
Nathan McBride: Which is essentially the autonomy-velocity paradox, which is, yes, the faster you are with AI,
289
00:38:08.090 --> 00:38:12.630
Nathan McBride: the more autonomy you see. But at the same time,
290
00:38:13.200 --> 00:38:17.429
Nathan McBride: no matter how fast you move, there's always someone downstream that's slower than you.
291
00:38:18.510 --> 00:38:24.050
Mike Crispin: Yep, and I think that's true. With governance and automation, and any number of things.
292
00:38:24.050 --> 00:38:24.600
Nathan McBride: Important.
293
00:38:24.600 --> 00:38:25.950
Mike Crispin: Yeah, across the board. Yeah.
294
00:38:26.410 --> 00:38:29.720
Nathan McBride: Whenever you're going to implement something that's a speed improvement
295
00:38:29.950 --> 00:38:35.539
Nathan McBride: or decision making improvement, it's only as relevant as the next stage in the process.
296
00:38:35.540 --> 00:38:35.980
Mike Crispin: That's right.
297
00:38:36.417 --> 00:38:41.230
Nathan McBride: So the question is always, should you only improve your process
298
00:38:41.670 --> 00:38:44.700
Nathan McBride: to be as fast as the slowest person? Or should you
299
00:38:45.030 --> 00:38:49.790
Nathan McBride: go after the slowest people, and improve their process to bring all the other processes up to speed.
300
00:38:50.080 --> 00:38:55.000
Nathan McBride: And you can take AI from both perspectives. But
301
00:38:55.370 --> 00:39:05.719
Nathan McBride: you can also do some more work on your own, which is, number one, test out circuit breakers, which are conditions at which AI has to pause and request human input.
302
00:39:06.040 --> 00:39:10.410
Nathan McBride: And I'm finding that, like, for instance, I'm building a CLMS for the company right now
303
00:39:10.927 --> 00:39:21.029
Nathan McBride: and working with our contracts manager. We put in these circuit breakers. We're putting in points where AI is doing, like, XYZ up to a point, and then it pauses and waits for her.
304
00:39:21.830 --> 00:39:22.630
Nathan McBride: It's great.
305
00:39:22.630 --> 00:39:23.380
Mike Crispin: No, that makes sense.
306
00:39:23.380 --> 00:39:28.419
Nathan McBride: You've got to do QC time, QC time on it. We're not, like, trusting the AI to the QC moment.
307
00:39:29.310 --> 00:39:34.710
Nathan McBride: There's oversight layers, which is either secondary AI
308
00:39:34.820 --> 00:39:40.639
Nathan McBride: systems that monitor the decision, or a human being that monitors the decision and flags issues.
309
00:39:40.970 --> 00:39:46.489
Nathan McBride: So yes, let AI run, but then have a parallel process that's checking that AI,
310
00:39:46.820 --> 00:39:49.860
Nathan McBride: whether it's another AI or a human being.
311
00:39:50.490 --> 00:39:56.780
Nathan McBride: Then there's Post Action Review, which is regular human review of AI decisions to identify patterns and make adjustments.
312
00:39:56.890 --> 00:39:59.370
Nathan McBride: So basic process improvement control.
313
00:40:01.790 --> 00:40:07.669
Nathan McBride: There's bounded autonomy, which is, here are the parameters within which AI can actually work.
314
00:40:07.930 --> 00:40:10.370
Nathan McBride: It can't go outside of these rails.
315
00:40:11.240 --> 00:40:20.649
Nathan McBride: And then, lastly, is value alignment which is ensuring. AI is explicitly trained to optimize for your
316
00:40:21.000 --> 00:40:25.549
Nathan McBride: your decision making process. So you can't just use an implicit Llm
317
00:40:25.670 --> 00:40:28.860
Nathan McBride: for decision making because you're gonna get the phone book
318
00:40:29.180 --> 00:40:29.730
Mike Crispin: Exactly.
319
00:40:29.730 --> 00:40:33.530
Nathan McBride: An explicit Llm. To do this kind of work, and so.
320
00:40:33.530 --> 00:40:37.860
Mike Crispin: Absolutely. I like that. You've got sort of the circuit breaker approach. And those gates
321
00:40:38.160 --> 00:40:45.890
Mike Crispin: are important, that you're not handing the entire thing over. So that, like we said last week, when you don't know if something works,
322
00:40:46.654 --> 00:40:52.670
Mike Crispin: or if you're trying to move very fast and completely delegate to AI, you're you're gonna lose any.
323
00:40:52.820 --> 00:41:01.299
Mike Crispin: You're gonna lose any control or any ability to fix and or respond to issues because you're just not gonna know how it works. And
324
00:41:01.850 --> 00:41:12.720
Mike Crispin: in our, like I said also last week, in our industry, you have to know how data gets from one point to another, or how data gets transformed, or what's the data integrity.
325
00:41:12.840 --> 00:41:16.900
Mike Crispin: So whether it's a contract lifecycle management system, or it's your
326
00:41:17.080 --> 00:41:28.739
Mike Crispin: discovery data or your clinical data. It's like, you can use AI to help you do any number of things, but you have to know how it got the answer. And I think that's where some of the
327
00:41:29.140 --> 00:41:30.835
Mike Crispin: the challenge lies.
328
00:41:31.760 --> 00:41:39.499
Mike Crispin: though I think the research models that are out and those agent based research engines that are being built into all the
329
00:41:40.010 --> 00:41:46.160
Mike Crispin: top 5 AI platforms right now give you kind of an audit trail of the thinking
330
00:41:46.290 --> 00:41:49.040
Mike Crispin: and the sourcing and everything which is nice, but
331
00:41:50.140 --> 00:41:57.949
Mike Crispin: it's it's interpreting it a certain way as well that it's gonna be hard to prove. And I think that's where the real crux of the matter is. And all this stuff is.
332
00:41:58.940 --> 00:42:01.710
Mike Crispin: you know, at some point when when does
333
00:42:02.944 --> 00:42:10.470
Mike Crispin: regulatory authorities and, you know, governments, or the
334
00:42:10.580 --> 00:42:14.470
Mike Crispin: policy owners, safety owners of a lot of these
335
00:42:15.020 --> 00:42:18.099
Mike Crispin: kind of business models and rules across the country.
336
00:42:18.290 --> 00:42:23.639
Mike Crispin: Just say, yeah, it's good enough, and then we've lost a lot of autonomy because.
337
00:42:23.640 --> 00:42:24.370
Nathan McBride: Pretty much.
338
00:42:24.890 --> 00:42:31.520
Mike Crispin: Right. But and and that's the question might become is, you're losing the autonomy, or you're, you know, are you?
339
00:42:31.690 --> 00:42:32.476
Mike Crispin: Are you?
340
00:42:34.480 --> 00:42:40.414
Mike Crispin: That's now just a given, you know. Then what's everyone gonna do for fun?
341
00:42:40.970 --> 00:42:46.170
Mike Crispin: this could, I think, really be a real inflection point in the next 20 years, where there are gonna be a lot of people who
342
00:42:47.100 --> 00:42:52.820
Mike Crispin: aren't gonna have very much to do, I think. And it's, how are we gonna react to that?
343
00:42:54.860 --> 00:42:56.540
Nathan McBride: Yeah, that's
344
00:42:57.380 --> 00:43:00.989
Mike Crispin: I know it's a quite that's a bigger philosophical discussion. But
345
00:43:01.350 --> 00:43:07.042
Mike Crispin: it when you talk about AI and quantum and implants and
346
00:43:08.790 --> 00:43:14.200
Mike Crispin: you know, and you know, and quantum brings us to a whole other world. We won't go there yet, but it's just
347
00:43:14.880 --> 00:43:18.819
Mike Crispin: we don't even know what's gonna happen, and that's why we can joke about.
348
00:43:18.820 --> 00:43:20.260
Nathan McBride: Had all the podcasts.
349
00:43:20.260 --> 00:43:28.790
Mike Crispin: All I'm saying is, you know, like, why have a roadmap? This stuff is changing so fast? You you there's no way that you can keep up.
350
00:43:28.970 --> 00:43:32.089
Mike Crispin: and it's going to change your business's direction as well.
351
00:43:32.720 --> 00:43:33.510
Mike Crispin: Well, the.
352
00:43:33.510 --> 00:43:37.989
Nathan McBride: Sure. "Why have a roadmap" is a great one. Because why, why the fuck have a roadmap?
353
00:43:38.670 --> 00:43:44.140
Mike Crispin: I guess it's a point like what's what's autonomy with a roadmap?
354
00:43:44.430 --> 00:43:45.779
Mike Crispin: You're gonna stick to the roadmap.
355
00:43:46.640 --> 00:43:50.179
Nathan McBride: This brings us all the way back to episode one, I mean ultimately.
356
00:43:50.180 --> 00:43:52.550
Mike Crispin: Not to, not to throw over the table, but like
357
00:43:53.870 --> 00:43:56.900
Mike Crispin: completely, it's completely gonna change everything.
358
00:43:57.510 --> 00:43:58.600
Nathan McBride: How?
359
00:43:59.920 --> 00:44:06.199
Nathan McBride: Where is the divide between the technological issue and the cultural issue? Because they are at.
360
00:44:06.200 --> 00:44:07.540
Mike Crispin: They're intertwined.
361
00:44:07.930 --> 00:44:09.490
Mike Crispin: They're intertwined.
362
00:44:11.270 --> 00:44:12.350
Nathan McBride: I mean huge.
363
00:44:13.360 --> 00:44:21.349
Nathan McBride: Yes, there's all this existential dread, but it's all kind of blown out of proportion. The fact is.
364
00:44:21.490 --> 00:44:30.060
Nathan McBride: we're actively seeking to speed up our world because it's a clever toy until.
365
00:44:30.060 --> 00:44:30.430
Mike Crispin: 12.
366
00:44:30.430 --> 00:44:39.520
Nathan McBride: We just, until we determine, it's like, oh, I mean, think of every rat experiment ever: the rat continues to get cheese until it gets shocked.
367
00:44:40.460 --> 00:44:45.269
Mike Crispin: I don't know if it's perhaps a clever toy, I mean, I I think there's already been
368
00:44:46.210 --> 00:44:55.590
Mike Crispin: discoveries because of AI-related things. You know, who was the guy from DeepMind that, you know, won the Nobel Prize for the discovery, development?
369
00:44:55.590 --> 00:45:04.918
Nathan McBride: Yeah, Mike, you can, you can. There was a guy that posted on LinkedIn. I forget his name. He had like 1,200 likes, and he wrote this big giant piece about
370
00:45:06.030 --> 00:45:24.869
Nathan McBride: Oh, listen! I was able to use these 3 models and a Llama subset to write this scientific paper, which passed the credibility test for journal submission. And it's an existential crisis. And so I'm reading, and I asked him, I said, where actually is the existential crisis, that people can just generate shit papers?
371
00:45:25.110 --> 00:45:27.109
Mike Crispin: I'm not talking about papers, about people who are building
372
00:45:27.700 --> 00:45:31.699
Mike Crispin: drugs with this thing and that could potentially change the world. It's already happening.
373
00:45:31.870 --> 00:45:34.300
Nathan McBride: But no one's making drugs, because people are
374
00:45:35.500 --> 00:45:48.849
Nathan McBride: people are making ideas, and no question that generative AI allows for the creation of new ideas like on a scale never before seen. But there's still someone down the road
375
00:45:49.610 --> 00:45:51.159
Nathan McBride: who's sitting in queue.
376
00:45:52.430 --> 00:45:57.839
Nathan McBride: who's sitting there saying, Holy shit! I was getting like a hundred ideas a week, and now I'm getting 10,000.
377
00:45:57.970 --> 00:46:00.170
Nathan McBride: I have to go through these.
378
00:46:00.610 --> 00:46:12.900
Nathan McBride: and they're still as slow as they were before. They go to their boss, and their boss is like, no, we're not using AI, or their boss is, okay, you can go ahead and use AI, and they speed up the process. There's someone after them.
379
00:46:13.710 --> 00:46:14.260
Mike Crispin: Yep.
380
00:46:14.810 --> 00:46:15.150
Nathan McBride: There's not.
381
00:46:15.150 --> 00:46:19.130
Mike Crispin: All I'm all I'm saying is, I think it's gone beyond a clever
382
00:46:19.260 --> 00:46:28.390
Mike Crispin: toy. I mean, I I think this blows cloud and other things that we've been excited about the last 20 years of our career away.
383
00:46:28.670 --> 00:46:31.110
Mike Crispin: So I mean, I don't think there's any comparison.
384
00:46:31.110 --> 00:46:33.430
Nathan McBride: What level, on what level does it blow it away?
385
00:46:33.810 --> 00:46:39.630
Mike Crispin: Well the ability to automate the ability to learn things faster than ever before. The ability to give.
386
00:46:39.630 --> 00:46:43.370
Nathan McBride: People access to information they've never been able to access before.
387
00:46:43.370 --> 00:46:47.490
Nathan McBride: Hold on generative AI is not an automated agent. It does not automate.
388
00:46:47.490 --> 00:46:49.929
Mike Crispin: AI, not generative AI. AI.
389
00:46:49.930 --> 00:46:52.380
Nathan McBride: AI does not automate AI.
390
00:46:52.380 --> 00:46:57.719
Mike Crispin: It can. AI can automate absolutely it can. What are you talking about? Yes, it can.
391
00:46:57.720 --> 00:46:59.310
Nathan McBride: What do you mean? What am I talking about? There's a hint.
392
00:46:59.310 --> 00:47:01.600
Mike Crispin: It absolutely can interpreter.
393
00:47:01.600 --> 00:47:09.440
Nathan McBride: Not make a decision over here and hand off to a platform it has no idea about. It can help you guide those decision-making processes. But there's no technical...
394
00:47:09.440 --> 00:47:09.870
Mike Crispin: Automate.
395
00:47:09.870 --> 00:47:10.590
Nathan McBride: Put it.
396
00:47:10.590 --> 00:47:17.477
Mike Crispin: It can make a decision and make a call and send a webhook. You can, you can automate with AI, I mean, we've talked about that before.
397
00:47:17.700 --> 00:47:20.780
Nathan McBride: Okay, okay, sending a webhook, no problem. But Mike, which...
398
00:47:20.780 --> 00:47:33.309
Mike Crispin: You can use an interpreter to automate interactive tasks now with Claude, right? So there's that type of thing, that's automation that automates remedial tasks, but it can do automation.
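As a rough illustration of the "make a decision and send a webhook" style of automation Mike is describing, here is a minimal sketch, assuming a hypothetical ticket-triage flow. The endpoint URL, the classify() stub, and the payload shape are all illustrative assumptions, not a specific product's API.

```python
# A hypothetical sketch: a model classifies an incoming item, then a plain HTTP
# webhook carries the result into another system. Replace WEBHOOK_URL and
# classify() with a real endpoint and a real LLM call before using this.

import json
import urllib.request

WEBHOOK_URL = "https://example.internal/hooks/ticket-triage"  # hypothetical endpoint


def classify(ticket_text: str) -> str:
    """Stand-in for an LLM call that labels a helpdesk ticket."""
    return "access_request" if "password" in ticket_text.lower() else "general"


def send_webhook(payload: dict) -> int:
    """POST the decision to a downstream system; returns the HTTP status code."""
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    ticket = "User cannot reset their password after the SSO change."
    label = classify(ticket)                                    # the "decision"
    status = send_webhook({"ticket": ticket, "label": label})   # the automation
    print(f"Routed as '{label}', webhook returned {status}")
```

Nathan's counterpoint still applies to this sketch: the model only fills one slot in a pipeline a human designed, and a human still reviews what lands on the other end of the webhook.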
399
00:47:33.310 --> 00:47:35.250
Nathan McBride: That's generative AI, Mike.
400
00:47:35.510 --> 00:47:39.400
Mike Crispin: AI. Gen... generative AI is one type of AI.
401
00:47:39.400 --> 00:47:41.989
Nathan McBride: A small subsection of AI. It's not AI.
402
00:47:42.680 --> 00:47:43.640
Nathan McBride: It's a small.
403
00:47:43.640 --> 00:47:46.509
Mike Crispin: I agree, I disagree fully a hundred percent.
404
00:47:46.838 --> 00:47:52.649
Nathan McBride: For generative AI to work, we have an agent that's reading the LLM.
405
00:47:53.430 --> 00:48:06.079
Mike Crispin: Which is, which is technically AI. If you ask someone what AI is, they're gonna say an LLM or an agent, or they're gonna talk about machine learning or the data model. I mean, there, there's different types of AI, right?
406
00:48:06.870 --> 00:48:12.960
Nathan McBride: Yeah, there's NLM, there's, I'm sorry, NLP, sure, there's
407
00:48:14.900 --> 00:48:17.170
Mike Crispin: I'm just. All I'm saying is, I don't think it's
408
00:48:17.170 --> 00:48:22.829
Mike Crispin: a clever thing anymore. I think it's emerged to be a pretty substantial
409
00:48:23.311 --> 00:48:31.139
Mike Crispin: lever in which to do business. And it's just we're at the very early infancy. But I don't think it's kind of a clever tool anymore.
410
00:48:32.260 --> 00:48:33.960
Nathan McBride: Well, so we can disagree on this point.
411
00:48:33.960 --> 00:48:34.560
Mike Crispin: Sure, yeah, we can.
412
00:48:34.560 --> 00:48:35.270
Nathan McBride: My calendar.
413
00:48:35.430 --> 00:48:39.640
Nathan McBride: No problem on that. I think generative AI is
414
00:48:39.850 --> 00:48:52.320
Nathan McBride: what most people are thinking about when they think AI, and generative AI is very limited in the scope of AI. A generative AI agent is very limited in what it can do, and it only has a certain data source. And AI,
415
00:48:52.500 --> 00:48:55.360
Nathan McBride: like a true, a true AI environment,
416
00:48:55.610 --> 00:49:06.450
Nathan McBride: does not have a bounded set of rules in which to operate, does not have a governance or constitution or HHH philosophy, or RLHF; it's actually just unbound.
417
00:49:07.230 --> 00:49:10.110
Nathan McBride: It's unbounded against no particular data set.
418
00:49:10.480 --> 00:49:11.130
Nathan McBride: That's correct.
419
00:49:11.130 --> 00:49:12.560
Mike Crispin: I agree. I agree.
420
00:49:12.560 --> 00:49:21.280
Nathan McBride: NLP, machine learning, ASR, generative AI, they all fit as little tiny parts of AI.
421
00:49:21.980 --> 00:49:30.290
Nathan McBride: But true AI itself does not. It's not constructed to automate. Generative AI automates the same way that,
422
00:49:31.370 --> 00:49:35.010
Nathan McBride: well, any middleware automates, which is to say, here's a set of instructions.
423
00:49:36.330 --> 00:49:38.050
Nathan McBride: There's still a human involved.
424
00:49:38.460 --> 00:49:40.710
Kevin Dushney: Except middleware doesn't hallucinate, right?
425
00:49:41.550 --> 00:49:42.029
Mike Crispin: That's right.
426
00:49:42.398 --> 00:49:44.241
Nathan McBride: Depends if you're using Boomi
427
00:49:44.740 --> 00:49:47.529
Nathan McBride: oops. Did I say that? Hi Kevin?
428
00:49:48.361 --> 00:49:52.859
Kevin Dushney: Are they? Gonna they gonna hit the name drop buzzer for Boomi, or
429
00:49:52.980 --> 00:49:54.700
Kevin Dushney: are we past that? At this point in the pod.
430
00:49:54.700 --> 00:49:58.209
Nathan McBride: I'm not name-dropping. I'm not name-dropping Boomi in a positive light.
431
00:49:58.210 --> 00:49:58.920
Mike Crispin: Ha! Ha!
432
00:50:00.120 --> 00:50:05.759
Nathan McBride: So so the, so the okay, how do we bring this back to center?
433
00:50:06.774 --> 00:50:10.340
Nathan McBride: Mike took us off a little bit there on side note.
434
00:50:10.340 --> 00:50:14.790
Mike Crispin: I was just. I was just telling Nate. I don't think AI is just a clever tool anymore.
435
00:50:15.310 --> 00:50:16.480
Nathan McBride: Clever toy.
436
00:50:16.480 --> 00:50:21.839
Mike Crispin: A clever toy. I think there's way more. There's way, more applications now to the just being kind of a clever thing.
437
00:50:21.840 --> 00:50:23.519
Kevin Dushney: I I do agree with that.
438
00:50:23.640 --> 00:50:28.789
Kevin Dushney: especially in the last 6 months, but doesn't mean it's, you know, anywhere near perfect.
439
00:50:29.220 --> 00:50:29.920
Mike Crispin: By no means.
440
00:50:30.550 --> 00:50:34.929
Kevin Dushney: I joined late. But I'm I'm gleaning. That's the the point Nate was trying to make.
441
00:50:35.140 --> 00:50:36.329
Mike Crispin: Would would you? Would you?
442
00:50:36.330 --> 00:50:41.049
Nathan McBride: For example, Mike, where you can tell me that generative AI actually automates this thing.
443
00:50:43.130 --> 00:50:46.390
Nathan McBride: Generative AI automates a process.
444
00:50:48.000 --> 00:50:52.320
Mike Crispin: Alright. How about I? I need a report written about something.
445
00:50:53.420 --> 00:50:54.330
Nathan McBride: Okay.
446
00:50:54.490 --> 00:50:55.050
Mike Crispin: Yep.
447
00:50:55.450 --> 00:50:59.340
Mike Crispin: I would have to write that with my own hands. I'd have to do the research I'd have to look up all
448
00:50:59.340 --> 00:51:00.280
Mike Crispin: the sources have had.
449
00:51:00.280 --> 00:51:01.410
Mike Crispin: That's automation.
450
00:51:01.410 --> 00:51:04.339
Nathan McBride: That's that's a clever toy. Automation is.
451
00:51:04.340 --> 00:51:05.190
Mike Crispin: Why is it?
452
00:51:05.190 --> 00:51:11.400
Nathan McBride: Hey, generative AI! Go all the way into Airtable and add this entry, and then, when you're done, go to select.
453
00:51:11.400 --> 00:51:12.740
Mike Crispin: You mean? You mean to say.
454
00:51:12.740 --> 00:51:15.570
Nathan McBride: Always make this happen every time this catalyst occurs.
455
00:51:15.570 --> 00:51:18.080
Mike Crispin: So whenever you and I use AI,
456
00:51:18.660 --> 00:51:28.170
Mike Crispin: you know, to do any of our work, or to speed things up, or to help us summarize something, that's just clever? This, we don't really need that, it's just a clever thing.
457
00:51:28.170 --> 00:51:29.069
Nathan McBride: Time-saver.
458
00:51:29.550 --> 00:51:34.840
Kevin Dushney: It's incremental right? It's not transformational. That's always. That's always how I pitch it internally.
459
00:51:35.090 --> 00:51:35.440
Nathan McBride: Yep.
460
00:51:35.440 --> 00:51:36.840
Kevin Dushney: Incremental improvements.
461
00:51:37.380 --> 00:51:42.040
Nathan McBride: I have saved myself an hour by having this thing here. Do this.
462
00:51:42.040 --> 00:51:42.500
Kevin Dushney: Yes.
463
00:51:42.500 --> 00:51:43.589
Nathan McBride: Okay, one time.
464
00:51:43.590 --> 00:51:44.280
Kevin Dushney: Don't take me.
465
00:51:44.610 --> 00:51:45.690
Mike Crispin: And it's automated.
466
00:51:46.950 --> 00:51:53.429
Nathan McBride: It's not automated, Mike. Okay, I can make Gems. I can click a button and have it do something. That's a clever version of automation.
467
00:51:53.430 --> 00:51:56.050
Mike Crispin: So when it puts together that report, it's not.
468
00:51:56.050 --> 00:51:57.090
Kevin Dushney: An automation example.
469
00:51:57.090 --> 00:51:58.110
Mike Crispin: It yourself.
470
00:52:00.510 --> 00:52:13.630
Kevin Dushney: I have an example. It's a good data scraper. So we've been playing with this internally around clinicaltrials.gov, where, using the APIs, you know, scrape the data for this particular indication,
471
00:52:14.357 --> 00:52:20.920
Kevin Dushney: disease area. What phase it's in and return the results and repeat that task daily.
472
00:52:23.530 --> 00:52:31.109
Kevin Dushney: That's something the regulatory group is asking for. So is it transformational? No, no, it's automated, and it is helpful.
473
00:52:31.710 --> 00:52:36.610
Kevin Dushney: But you know, it's it's just sort of it's an example of what you can do. But
474
00:52:36.750 --> 00:52:42.719
Kevin Dushney: is it the best thing since sliced bread? No, but it's a direct application of a piece of automation
475
00:52:42.890 --> 00:52:45.530
Kevin Dushney: and something they'd have to do manually on their own.
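As a rough sketch of the kind of daily clinicaltrials.gov pull Kevin describes, assuming the public v2 "studies" endpoint: the condition string, field paths, and scheduling below are illustrative and worth checking against the current API documentation.

```python
# A minimal, hypothetical sketch of a daily ClinicalTrials.gov pull: fetch studies
# for one indication, then summarize ID, title, phase, and status for review.
# Assumes the public v2 "studies" endpoint; verify parameter and field names.

import json
import urllib.parse
import urllib.request

BASE_URL = "https://clinicaltrials.gov/api/v2/studies"


def fetch_trials(condition: str, page_size: int = 25) -> list[dict]:
    """Return raw study records for a given indication / disease area."""
    params = urllib.parse.urlencode({"query.cond": condition, "pageSize": page_size})
    with urllib.request.urlopen(f"{BASE_URL}?{params}") as resp:
        return json.load(resp).get("studies", [])


def summarize(study: dict) -> dict:
    """Pull out the bits the regulatory group asked for: ID, title, phase, status."""
    protocol = study.get("protocolSection", {})
    ident = protocol.get("identificationModule", {})
    return {
        "nct_id": ident.get("nctId"),
        "title": ident.get("briefTitle"),
        "phases": protocol.get("designModule", {}).get("phases", []),
        "status": protocol.get("statusModule", {}).get("overallStatus"),
    }


if __name__ == "__main__":
    # Run this from a daily scheduler (cron, Airflow, etc.); as discussed below,
    # a human still QCs the output before anyone acts on it.
    for study in fetch_trials("duchenne muscular dystrophy"):
        print(summarize(study))
```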
476
00:52:45.880 --> 00:52:49.950
Nathan McBride: But Kevin, at the end of this, does a human being have to QC the data that comes out?
477
00:52:49.950 --> 00:52:51.640
Kevin Dushney: Yes, 100%.
478
00:52:51.770 --> 00:52:53.240
Nathan McBride: Yeah. Good.
479
00:52:53.240 --> 00:52:53.640
Kevin Dushney: Liberty.
480
00:52:53.640 --> 00:52:55.710
Nathan McBride: Toy. It's a summarization toy.
481
00:52:55.710 --> 00:52:56.100
Kevin Dushney: Alright!
482
00:52:56.100 --> 00:52:59.869
Nathan McBride: I'm done, Mike, you can argue against me. Tell me I'm gonna ask.
483
00:52:59.870 --> 00:53:00.390
Mike Crispin: I'm I'm.
484
00:53:00.670 --> 00:53:11.899
Kevin Dushney: I don't think the QC part goes away anytime in the near future. It still doesn't mean it's not a time saver, though, because if it gets to the point where the human can QC it, it's like medical writing,
485
00:53:12.210 --> 00:53:20.830
Kevin Dushney: but it can do a pretty good job and get you, let's say it gives you 70% there versus starting de novo, and that human can QC and clean up the rest.
486
00:53:21.110 --> 00:53:21.940
Kevin Dushney: that
487
00:53:22.060 --> 00:53:27.440
Kevin Dushney: I don't know. That's a force multiplier. I think over time, especially with things like a, you know, regulatory submission.
488
00:53:27.800 --> 00:53:34.300
Nathan McBride: 100%. However, devil's advocate you and say
489
00:53:34.660 --> 00:53:40.110
Nathan McBride: that every platform is promising to save time. Teams,
490
00:53:40.440 --> 00:53:46.720
Nathan McBride: SharePoint, OneDrive, Slack, they're all saying, speed it up and be better
491
00:53:46.920 --> 00:53:54.470
Nathan McBride: using our platform. That's the plug, right? That's the universal "why my shit's better than yours" pitch,
492
00:53:55.140 --> 00:53:56.360
Nathan McBride: because I'm faster.
493
00:53:56.960 --> 00:53:57.340
Kevin Dushney: Yeah.
494
00:53:57.340 --> 00:54:04.059
Mike Crispin: Sure, and true of all systems, right? I mean, that's how they're gonna pitch to you.
495
00:54:04.250 --> 00:54:05.670
Nathan McBride: But their convenience.
496
00:54:05.860 --> 00:54:14.050
Kevin Dushney: Until eventually it hits the bottleneck, which I know this is this is like, I know, that's your thesis right now is it's gonna hit a bottleneck, and I don't disagree with that.
497
00:54:14.050 --> 00:54:29.040
Nathan McBride: Well, they all go through clever waves, and Gen AI is riding the hype of a clever wave. Eventually people will realize, yes, you still can't automate, there's still a human that's required, and blah blah and all that stuff, anyway.
498
00:54:29.910 --> 00:54:33.119
Nathan McBride: in the in the spirit of go ahead.
499
00:54:33.896 --> 00:54:38.190
Mike Crispin: I was just gonna say, I think there are other areas besides Gen. AI, where
500
00:54:38.490 --> 00:54:46.409
Mike Crispin: there's AI innovation that's happening. That's that's all I'm saying, I think Gen. AI is what is most approachable and accessible.
501
00:54:47.125 --> 00:54:54.059
Mike Crispin: But you know, robotics and autonomous driving, and all that, that's all AI and data driven. And it's automated.
502
00:54:54.590 --> 00:54:56.059
Mike Crispin: And it's a game changer.
503
00:54:56.420 --> 00:54:58.370
Mike Crispin: Why, yeah, that wouldn't be happening.
504
00:54:58.630 --> 00:55:00.180
Nathan McBride: Why do you have AI cars, Mike?
505
00:55:00.893 --> 00:55:02.529
Mike Crispin: You will. I think I don't think.
506
00:55:02.530 --> 00:55:09.680
Nathan McBride: Why, why would we have? Why would we have self-driving cars? What's the number one reason why? We've been through this. I'm gonna ask you again, why would we have self-driving cars?
507
00:55:09.680 --> 00:55:10.570
Mike Crispin: Safety.
508
00:55:10.950 --> 00:55:12.290
Nathan McBride: To get drunk people home.
509
00:55:13.010 --> 00:55:17.519
Mike Crispin: Safety now to improve traffic safety. Yeah, drunk people. Home is another reason.
510
00:55:17.520 --> 00:55:44.919
Nathan McBride: You can't do it for safety's sake, because they've done the red ball theory, they've tested it in New York City a thousand times, and it fails a hundred percent of the time. You cannot do it until AI is able to pass the red ball theory, which will probably be never. You cannot use AI as a litmus test for improving safety, because it takes one person to throw a red ball in front of an AI car to stop traffic around all of midtown Manhattan,
511
00:55:45.430 --> 00:55:47.339
Nathan McBride: bring it to a grinding halt, because if.
512
00:55:47.340 --> 00:55:51.130
Mike Crispin: What would a norm, what would a regular person do if someone threw a red ball in front of their car.
513
00:55:51.510 --> 00:55:52.700
Nathan McBride: I don't know. But what do you.
514
00:55:52.700 --> 00:55:53.300
Mike Crispin: Stop.
515
00:55:53.800 --> 00:55:54.419
Nathan McBride: They run it over.
516
00:55:54.420 --> 00:55:56.239
Nathan McBride: What's the what's the what's the improvement.
517
00:55:56.240 --> 00:55:59.180
Kevin Dushney: Nate would run it over, and so would you, Mike?
518
00:55:59.180 --> 00:56:07.969
Mike Crispin: Yeah, I I think that's that's I. I think we would never innovate. If that's the the meter in which you held things to, I think we just stop and not even try.
519
00:56:07.970 --> 00:56:16.200
Nathan McBride: Innovate. My point is, that's why you won't see in our lifetime new York City. Turn into all autonomous vehicles.
520
00:56:16.200 --> 00:56:19.599
Mike Crispin: Perhaps not. No, but Minnesota, maybe
521
00:56:19.760 --> 00:56:26.040
Mike Crispin: Minnesota, perhaps New Orleans, Florida, New Orleans. Wherever pony potholes.
522
00:56:26.270 --> 00:56:26.630
Kevin Dushney: Yeah.
523
00:56:26.630 --> 00:56:31.420
Mike Crispin: But that's what I mean. Like gradual, gradual state drug discovery is another area where this.
524
00:56:31.420 --> 00:56:33.320
Nathan McBride: Lost a lot growing targets.
525
00:56:33.650 --> 00:56:39.930
Mike Crispin: Fraud prevention and a lot of the cyber security space. AI is helping to move against that. I think that's another area.
526
00:56:40.430 --> 00:56:44.060
Mike Crispin: I mean, it's I mean, yeah, none of it's all done, set and done, but
527
00:56:44.380 --> 00:56:56.020
Mike Crispin: we'll all be using it in 6 months and be saying, Oh, jeez, it actually is works. I'm not saying right now. It's perfect. But, Kevin, what I did is I made a leap, and you probably disagree with this, but.
528
00:56:56.760 --> 00:56:57.590
Nathan McBride: Believe he did.
529
00:56:57.590 --> 00:57:00.520
Mike Crispin: I did. Yeah, I think this is bigger.
530
00:57:00.910 --> 00:57:06.619
Mike Crispin: Then the things that we've been all excited about you know that the impact of this will be bigger than
531
00:57:07.600 --> 00:57:12.290
Mike Crispin: what were we excited about? Open source, VMware, cloud,
532
00:57:12.590 --> 00:57:20.029
Mike Crispin: certain data analytics tools, these big things we went to Gartner and got excited about. I think this is going to eclipse all those things. And right now.
533
00:57:20.030 --> 00:57:22.370
Kevin Dushney: I thought we were talking about virtualization tonight.
534
00:57:22.370 --> 00:57:23.269
Mike Crispin: It's not there yet.
535
00:57:23.270 --> 00:57:24.880
Mike Crispin: You didn't say why. It's not
536
00:57:24.880 --> 00:57:27.150
Mike Crispin: there yet, but I think it will get there.
537
00:57:27.450 --> 00:57:28.570
Nathan McBride: He didn't say why.
538
00:57:30.060 --> 00:57:30.940
Mike Crispin: Why?
539
00:57:31.100 --> 00:57:31.660
Nathan McBride: Yeah.
540
00:57:32.836 --> 00:57:38.719
Mike Crispin: Well, I would say, adoption is the fastest of any product that's ever existed. It's given access.
541
00:57:38.720 --> 00:57:40.370
Nathan McBride: How do you define adoption of AI.
542
00:57:40.370 --> 00:57:44.779
Mike Crispin: Just take ChatGPT, if you want to go there. It's the fastest-growing app of all time.
543
00:57:44.960 --> 00:57:45.530
Mike Crispin: So I have.
544
00:57:45.530 --> 00:57:46.190
Mike Crispin: I think
545
00:57:47.040 --> 00:57:54.170
Mike Crispin: I think Gemini hallucinated the script. I thought we were talking about VMware and virtualizing my physical servers tonight on the podcast.
546
00:57:54.170 --> 00:57:54.840
Mike Crispin: stop.
547
00:57:55.390 --> 00:57:57.650
Kevin Dushney: Oh, so that was last week.
548
00:57:58.550 --> 00:57:59.919
Mike Crispin: That was last week.
549
00:58:00.130 --> 00:58:01.870
Kevin Dushney: I was on last week.
550
00:58:02.838 --> 00:58:05.121
Mike Crispin: That's where you were looking to last week's script. Then.
551
00:58:07.700 --> 00:58:08.500
Nathan McBride: Okay.
552
00:58:08.500 --> 00:58:09.549
Kevin Dushney: Anyway. Go on. Sorry.
553
00:58:09.550 --> 00:58:17.189
Mike Crispin: No, this is great. No, I was just saying I was just saying, I think that's that's what got us into our debate when we when we when you came on?
554
00:58:17.190 --> 00:58:17.645
Mike Crispin: Okay.
555
00:58:20.030 --> 00:58:27.439
Nathan McBride: I mean all things that we can tackle in time, but they're they're all 0 sum arguments, which is to say
556
00:58:27.710 --> 00:58:34.369
Nathan McBride: honestly, are you for or against AI? But you might as well argue for or against power,
557
00:58:35.920 --> 00:58:39.133
Nathan McBride: or or or or oxygen. There's no point. It's
558
00:58:39.490 --> 00:58:40.070
Mike Crispin: Yep.
559
00:58:40.070 --> 00:58:44.760
Nathan McBride: It's a mechanism that exists at the moment to theoretically.
560
00:58:45.280 --> 00:59:02.680
Nathan McBride: theoretically improve everyone's lives. But no one actually has any fucking concept as to how it actually improves anyone's lives. So it's a it's a construct of cleverness to say, if you use this, it will make things better. But no one understands the relative
561
00:59:03.170 --> 00:59:12.879
Nathan McBride: basic definition of what better means better than it was. It is today will be tomorrow. There's no delineation. It's just this thing that keeps growing
562
00:59:13.500 --> 00:59:16.090
Nathan McBride: based on its own clever merits.
563
00:59:16.460 --> 00:59:21.560
Nathan McBride: So to Kevin's point, force multiplier it must be. But even then.
564
00:59:23.280 --> 00:59:26.140
Nathan McBride: I mean, unless you have a baseline
565
00:59:27.010 --> 00:59:35.700
Nathan McBride: fucking anything, to be a force multiplier. I could wear a catcher's mitt to work, and because I wear it, I'm more efficient at doing one thing. Is that a force multiplier?
566
00:59:36.920 --> 00:59:37.620
Kevin Dushney: No.
567
00:59:38.090 --> 00:59:46.250
Nathan McBride: But anyway, that's about as much as I know about baseball. So, in the spirit
568
00:59:46.980 --> 00:59:50.950
Nathan McBride: of this podcast being a relatively timely podcast.
569
00:59:50.950 --> 00:59:51.550
Mike Crispin: Sure.
570
00:59:52.400 --> 00:59:58.579
Nathan McBride: We will move forward. But, Mike, we will come back to these things. I'm not letting off the hook that easy, so.
571
01:00:00.080 --> 01:00:02.189
Mike Crispin: You can't. I'd love to talk more about it.
572
01:00:02.950 --> 01:00:06.039
Nathan McBride: Okay? Well, you'll have some more. You'll have an opportunity in a moment to.
573
01:00:06.040 --> 01:00:06.760
Mike Crispin: Okay.
574
01:00:06.760 --> 01:00:07.389
Kevin Dushney: That's it.
575
01:00:07.940 --> 01:00:10.730
Mike Crispin: Talk about this so sure, unless you have something else to say.
576
01:00:11.090 --> 01:00:14.820
Mike Crispin: No, no, I'll hold I'll hold on to it. You hold on to it.
577
01:00:15.172 --> 01:00:16.579
Kevin Dushney: We save it. Yeah.
578
01:00:16.580 --> 01:00:17.360
Nathan McBride: Glitch it.
579
01:00:17.900 --> 01:00:18.650
Mike Crispin: I'm clinching them.
580
01:00:18.650 --> 01:00:19.950
Nathan McBride: Walk it in.
581
01:00:20.580 --> 01:00:21.100
Mike Crispin: Oh!
582
01:00:21.540 --> 01:00:35.220
Nathan McBride: So before. I don't remember where we were talking about it, but at some point a while ago we were talking about AI, and I wanted to cover the idea of vendor relationship lock-in, which is that traditional lock-in as we know it.
583
01:00:35.300 --> 01:00:55.900
Nathan McBride: Mike likes to refer to Gartner, but really, any lock-in is that you're going to have proprietary file formats. For instance, we used Lucidchart in the past. Lucidchart uses LCF documents; that's proprietary, you can't use those anywhere else. Or closed APIs, or restrictive licensing. But no matter what, it's a lot of zero-based decision making.
584
01:00:56.000 --> 01:01:00.589
Nathan McBride: So if you buy this vendor, you're going to be living with that vendor. Think about it.
585
01:01:00.780 --> 01:01:16.389
Nathan McBride: AI, whether it's an AI vendor or a vendor who's using collaborative AI to create an AI layer, you're creating model lock-in, which is harder to recognize and mitigate. So yeah, I mean, like, Xilio is an Anthropic and Gemini company.
586
01:01:17.271 --> 01:01:20.219
Nathan McBride: Does that mean I'm locked in. Well.
587
01:01:20.500 --> 01:01:32.550
Nathan McBride: yes and no. I'm not locked into the point that my people know how to write prompts for any platform they've been trained on this. They're very good at it. But am I locked in on pushing them towards a deep research world
588
01:01:32.780 --> 01:01:49.620
Nathan McBride: for Gemini? Yes, a decision I've had to consciously make, and I wrestled with every single day, but it still seems like the best proposition for the moment to get us forward. But here's how it generally works using AI platform vendor services. The AI learns from you.
589
01:01:51.166 --> 01:01:58.529
Nathan McBride: It becomes increasingly tailored to your needs and patterns. If you're a Copilot company and you're deep in Copilot, you've already learned this.
590
01:01:58.650 --> 01:02:02.939
Nathan McBride: This creates value, but it also means that switching vendors becomes increasingly costly.
591
01:02:03.540 --> 01:02:10.980
Nathan McBride: not just in terms of technical migration, but in terms of learning or lost learning and repersonalization.
592
01:02:11.210 --> 01:02:13.669
Nathan McBride: This is not the case with collaborative AI.
593
01:02:13.910 --> 01:02:17.599
Nathan McBride: Collaborative AI learns from the explicit data it's given.
594
01:02:18.990 --> 01:02:22.990
Kevin Dushney: I agree on the personalization. But I would challenge the the learning piece.
595
01:02:23.500 --> 01:02:33.079
Kevin Dushney: because my view is that the skills you learn in prompting are pretty transferable between platforms. Yeah, you lose the learning.
596
01:02:33.260 --> 01:02:38.770
Kevin Dushney: But in your Copilot example there really is no learning, so
597
01:02:39.810 --> 01:02:44.580
Kevin Dushney: I don't. I don't think that one applies there. ChatGPT, for sure, because it actively builds a memory,
598
01:02:45.380 --> 01:02:56.100
Kevin Dushney: and Claude, I don't know if it's changed, but it's within the same prompt, right, or the same conversation, I should say, versus a persistent memory behind the scenes, which I think,
599
01:02:56.240 --> 01:02:57.990
Kevin Dushney: leaving ChatGPT
600
01:02:58.230 --> 01:03:05.640
Kevin Dushney: after you've built up that persistent memory over time to me that would be a a bigger barrier to lock in than some of the other things. But
601
01:03:05.790 --> 01:03:07.330
Kevin Dushney: anyway, just to interject.
602
01:03:07.330 --> 01:03:12.430
Nathan McBride: No, no, it's a good point. Let me add another sentence. Then there's a context awareness of
603
01:03:12.900 --> 01:03:15.880
Nathan McBride: if I have, say, 10 million of a thing.
604
01:03:16.020 --> 01:03:20.749
Nathan McBride: And I train the system on that 10 million of that thing.
605
01:03:20.840 --> 01:03:47.379
Nathan McBride: And it's finely tuned to do metadata detection or context awareness of how the business runs and knows what to do. And then I have to train a new model to do the exact same thing; I lost all that initial training. Now, it probably wouldn't take very long, due to the cleverness of the toy that I'd be picking, to get a new system up and running, but still I lose the knowledge that it's already been built around,
606
01:03:47.380 --> 01:03:47.990
Kevin Dushney: Yeah.
607
01:03:48.150 --> 01:03:50.180
Nathan McBride: Around a context awareness. And
608
01:03:51.040 --> 01:03:57.210
Nathan McBride: this is a conversation that's come up a few times recently with me in various use cases. But in truth.
609
01:03:57.800 --> 01:04:04.179
Nathan McBride: there's a lot of thinking about it. And I'm challenging a lot of my vendors on this, especially Box,
610
01:04:04.290 --> 01:04:06.519
Nathan McBride: about, you tell me
611
01:04:07.010 --> 01:04:19.769
Nathan McBride: that for 20 million documents you can build context awareness without having any metadata in place, which is a claim Box makes. But how do you know what confidential means?
612
01:04:20.820 --> 01:04:33.470
Nathan McBride: And no one's able to answer that question like, if I have a document that's called confidential, and I have another document says the word confidential in it, and I have a 3rd document that's got neither the title or the word, but is extremely confidential. How do you know?
613
01:04:34.570 --> 01:04:40.650
Nathan McBride: Yeah, that's where no tagging or no metadata breaks down. It breaks down in the absence of that.
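As a toy illustration of the gap Nate is describing: filename and keyword rules catch the first two documents below, but a genuinely sensitive file with neither signal is invisible without tags, metadata, or a classifier trained on business context. The documents and rules here are made up.

# Toy illustration (hypothetical documents and rules): why filename/keyword heuristics
# miss the third kind of confidential document when there is no tagging or metadata.
docs = [
    {"name": "Confidential_board_deck.pdf", "text": "Q3 plan ...", "tags": []},
    {"name": "meeting_notes.docx", "text": "This memo is confidential ...", "tags": []},
    {"name": "compound_x_results.xlsx", "text": "assay readouts ...", "tags": []},  # sensitive, no signal
]

def looks_confidential(doc):
    by_name = "confidential" in doc["name"].lower()
    by_text = "confidential" in doc["text"].lower()
    by_tag = any(t.lower() == "confidential" for t in doc["tags"])
    return by_name or by_text or by_tag

for doc in docs:
    print(doc["name"], "->", looks_confidential(doc))
# The third document prints False: with no curated tags or metadata, the heuristic
# has nothing to work with, which is the breakdown being described here.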
614
01:04:41.330 --> 01:04:48.020
Mike Crispin: Like you said before, Nate. With them, I think I was. Part of that conversation is that they need context. They need some context, right?
615
01:04:48.020 --> 01:04:48.470
Mike Crispin: It doesn't.
616
01:04:48.470 --> 01:04:49.679
Nathan McBride: You can buy this.
617
01:04:49.680 --> 01:04:51.199
Mike Crispin: They can't do it from scratch.
618
01:04:51.650 --> 01:04:56.149
Mike Crispin: But doesn't they? They need context right to to be able to even start.
619
01:04:56.300 --> 01:04:59.361
Mike Crispin: They can't start from 0. They can't start from 0 absolutely.
620
01:04:59.640 --> 01:05:05.299
Nathan McBride: So that's the fine tuning element I'm referring to here, which when you start to fine, tune a platform
621
01:05:05.898 --> 01:05:08.129
Nathan McBride: and I think you know we like
622
01:05:08.360 --> 01:05:20.750
Nathan McBride: all these big platforms, the Boxes, Dropboxes, Egnytes, they all run on either OpenAI or Claude, or some variation of the two. You're fine-tuning it over time, and again, to change is very. It's very
623
01:05:20.980 --> 01:05:37.289
Nathan McBride: probably straightforward today. A year from now might be a little bit more painful or a lot more painful, but eventually it will also be starting over at a lower performance level. So oh, well, we're not having good luck with this particular vendor. We need to change AI vendors.
624
01:05:38.000 --> 01:05:40.259
Nathan McBride: And then you're like, Okay, well.
625
01:05:40.590 --> 01:05:43.299
Nathan McBride: there's not like a JSON file we can export,
626
01:05:44.410 --> 01:05:52.519
Nathan McBride: like, or something that we can use as a training CSV file. We have to start from fucking scratch.
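There is no vendor-neutral export of a tuned environment today, so one hedge is to keep your own source of truth outside any one platform. A minimal sketch, assuming you have been logging your system prompts, curated examples, and business glossary yourself; the file names and fields are illustrative, not a standard.

# Sketch: keep a vendor-neutral record of prompts, examples, and context so a platform
# switch is a rebuild from your own files, not from scratch. Names and fields are illustrative.
import json

portable_config = {
    "system_prompts": {"helpdesk_triage": "You are the IT triage assistant for ..."},
    "examples": [
        {"input": "Reset MFA for a contractor", "ideal_output": "Verify identity, open an access ticket ..."},
    ],
    "glossary": {"QPP": "quarterly portfolio plan"},  # business context the model was taught
}

with open("portable_ai_config.json", "w") as f:
    json.dump(portable_config, f, indent=2)

# Examples as JSONL, a common interchange shape for fine-tuning or evaluation pipelines.
with open("examples.jsonl", "w") as f:
    for ex in portable_config["examples"]:
        f.write(json.dumps(ex) + "\n")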
627
01:05:52.830 --> 01:05:58.139
Nathan McBride: Yes, and that's a new autonomy challenge for IT leaders that, you know, we haven't really
628
01:05:58.350 --> 01:06:03.390
Nathan McBride: felt right now. It'll be a while before we truly feel it,
629
01:06:04.060 --> 01:06:08.130
Nathan McBride: because how hard is it to plug a new Gen. AI. Agent on top of your shit and have it.
630
01:06:08.130 --> 01:06:10.240
Nathan McBride: Yeah, start doing implicit learning.
631
01:06:10.610 --> 01:06:11.270
Nathan McBride: But
632
01:06:12.880 --> 01:06:22.710
Kevin Dushney: So I I think for you in the context, here is user employee to AI interactions. But what if you built a workflow
633
01:06:22.990 --> 01:06:29.740
Kevin Dushney: with Apis. And to me, that's the bigger problem is, if you've built a whole workflow on top of this.
634
01:06:30.220 --> 01:06:38.000
Kevin Dushney: Now that's part of your stack. And how do you swap that out, and how easy is that to do? It's like, say, I want to move from OpenAI to,
635
01:06:38.290 --> 01:06:43.910
Kevin Dushney: you know, Gemini or Claude, and use a completely different set of APIs to drive the workflow.
636
01:06:44.230 --> 01:06:44.640
Nathan McBride: Yeah.
637
01:06:44.640 --> 01:06:51.000
Kevin Dushney: To me. That's a a huge lift compared to like alright. I gotta retrain my employee base on using a new model.
638
01:06:52.000 --> 01:06:58.309
Kevin Dushney: They're both problematic. But to me that you know, once it becomes integrated in part of your model, like almost like Middleware.
639
01:06:59.010 --> 01:07:03.390
Kevin Dushney: that's where the lock in really starts to become potentially painful if you have to swap that out.
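One common mitigation for the middleware-style lock-in Kevin is describing is a thin abstraction layer, so that vendor-specific SDK calls live in one adapter per provider and the workflow only sees a neutral interface. A minimal sketch; the adapter bodies are left as stubs rather than real SDK calls, since those depend on the platform chosen.

# Sketch: isolate vendor-specific API calls behind one interface so a workflow built on
# top can swap providers. Adapter bodies are stubs, not real SDK signatures.
from abc import ABC, abstractmethod

class LLMClient(ABC):
    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's text completion for a prompt."""

class OpenAIClient(LLMClient):
    def complete(self, prompt: str) -> str:
        raise NotImplementedError("call the OpenAI SDK here")

class GeminiClient(LLMClient):
    def complete(self, prompt: str) -> str:
        raise NotImplementedError("call the Gemini SDK here")

def summarize_ticket(client: LLMClient, ticket_text: str) -> str:
    # the workflow depends only on the interface, never on the vendor
    return client.complete("Summarize this support ticket in two sentences:\n" + ticket_text)

# Swapping vendors then means swapping the adapter, not rewriting the workflow:
# summary = summarize_ticket(GeminiClient(), ticket_text)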
640
01:07:05.270 --> 01:07:06.479
Nathan McBride: 100%. And that's.
641
01:07:06.480 --> 01:07:06.860
Mike Crispin: If.
642
01:07:06.860 --> 01:07:12.180
Nathan McBride: A a conversation that's really maybe not happening, or it's very nascent in companies.
643
01:07:12.180 --> 01:07:12.890
Kevin Dushney: Yes.
644
01:07:12.890 --> 01:07:17.420
Nathan McBride: How to build in a model about how you built your model.
645
01:07:18.760 --> 01:07:30.020
Nathan McBride: It's not like all the way back to, like, agile basic development type stuff. But if I'm gonna go ahead and put in a particular Gen AI model, an LLM, and fine-tune it, and an explicit level of searching,
646
01:07:30.560 --> 01:07:36.429
Nathan McBride: like, how is it documented, and how is it replicable? And should I be building systems that are truly portable?
647
01:07:37.020 --> 01:07:42.500
Nathan McBride: Or if I'm going all in with a vendor, am I just biting the bullet and saying, I hope I never leave this vendor.
648
01:07:44.805 --> 01:07:46.320
Nathan McBride: Mike, go ahead.
649
01:07:47.320 --> 01:07:57.439
Mike Crispin: To say, I agree, especially, mostly to the point Kevin was making on the ChatGPT component, because I think that's the most accurate
650
01:07:57.440 --> 01:07:57.900
Kevin Dushney: Yeah.
651
01:07:57.900 --> 01:08:06.409
Mike Crispin: Way that these models will grow is more an assistant model from a Gen. AI perspective. Pretty much is that
652
01:08:06.600 --> 01:08:08.600
Mike Crispin: you're gonna have the ability
653
01:08:09.700 --> 01:08:18.730
Mike Crispin: to perhaps bring your Gen. AI model from one place to another, and that's gonna be a whole other challenge for us. And it once we get to that point. But for now
654
01:08:18.920 --> 01:08:22.520
Mike Crispin: the enterprise completely controls the AI.
655
01:08:22.750 --> 01:08:24.010
Mike Crispin: And I think
656
01:08:24.380 --> 01:08:31.189
Mike Crispin: that's gonna become a core part of how people work, if, as it is today, IT owns the AI platform,
657
01:08:31.529 --> 01:08:38.770
Mike Crispin: and to switch it out is gonna be once it's learned who you are is gonna be painful, more painful than any other.
658
01:08:38.880 --> 01:08:50.219
Mike Crispin: I think thing that we've experienced because you've built a almost if it's successful, you've built a relationship almost with the AI over time. And that's gonna be impossible to rip out.
659
01:08:50.569 --> 01:08:51.349
Nathan McBride: Let me throw this boat.
660
01:08:51.350 --> 01:08:53.410
Mike Crispin: It's gonna be like firing someone.
661
01:08:54.149 --> 01:08:56.639
Nathan McBride: Yeah. Well, speaking of that note
662
01:08:58.350 --> 01:09:06.349
Nathan McBride: here's a softball throw to you both that I've been thinking about, which is, let's suppose that there is a tenured
663
01:09:06.509 --> 01:09:09.949
Nathan McBride: senior director of bioengineering.
664
01:09:10.420 --> 01:09:11.010
Mike Crispin: Yep.
665
01:09:11.510 --> 01:09:13.750
Nathan McBride: 15 years!
666
01:09:14.090 --> 01:09:16.399
Nathan McBride: Double Phd. Genius!
667
01:09:16.720 --> 01:09:19.039
Nathan McBride: And he's coming to your company.
668
01:09:19.660 --> 01:09:21.930
Nathan McBride: And this is the year 2027,
669
01:09:22.620 --> 01:09:25.689
Nathan McBride: and this person has got a wealth of knowledge.
670
01:09:25.990 --> 01:09:30.200
Nathan McBride: years and years of development. He comes in your company and guess what? He's an infant
671
01:09:30.590 --> 01:09:38.899
Nathan McBride: because he's got no fucking way to port all of his AI work into your system.
672
01:09:39.450 --> 01:09:42.870
Nathan McBride: Yeah. So he starts in your company, and your company is like, what's your name?
673
01:09:43.050 --> 01:09:48.419
Nathan McBride: He's like, Jim, Jim, what do you do? It has to learn from 0.
674
01:09:49.359 --> 01:09:49.969
Mike Crispin: Yup!
675
01:09:50.229 --> 01:09:56.259
Nathan McBride: Because Jim can't port his world to your company in terms of AI. He has to start from scratch.
676
01:09:56.890 --> 01:09:57.610
Mike Crispin: And that, and that.
677
01:09:57.610 --> 01:10:00.759
Nathan McBride: Your environment has to learn him. How do you combat that.
678
01:10:00.760 --> 01:10:06.349
Mike Crispin: That is exactly why I think we it won't be anytime soon, but you'll bring your AI with you.
679
01:10:07.309 --> 01:10:07.969
Nathan McBride: Okay.
680
01:10:08.299 --> 01:10:20.819
Mike Crispin: It opens a whole new can of worms and a whole bunch of IP issues and everything else. But you're not. Gonna there may come a time, and I know we're talking far into the future, where
681
01:10:20.820 --> 01:10:21.240
Mike Crispin: so.
682
01:10:21.240 --> 01:10:22.420
Nathan McBride: No one's gonna come, and we're.
683
01:10:22.420 --> 01:10:26.779
Mike Crispin: No one's gonna come work for you if you don't let them use the AI that they've used before.
684
01:10:26.780 --> 01:10:29.340
Mike Crispin: Yeah. But how do they? How do they export it, Mike? Like they don't.
685
01:10:29.340 --> 01:10:29.750
Kevin Dushney: I.
686
01:10:29.750 --> 01:10:31.740
Mike Crispin: They bring it, they bring it with them, they own it.
687
01:10:31.740 --> 01:10:33.499
Kevin Dushney: So I think this is an.
688
01:10:33.500 --> 01:10:34.139
Mike Crispin: That I did.
689
01:10:34.140 --> 01:10:40.020
Kevin Dushney: Use your personal account and a disincentive to use the Company's enterprise account.
690
01:10:40.020 --> 01:10:40.740
Mike Crispin: Yes.
691
01:10:42.150 --> 01:10:43.269
Kevin Dushney: Because I mean, this is all
692
01:10:43.270 --> 01:10:51.989
Kevin Dushney: in your personal ChatGPT, or whatever your tool of choice is, it knows you. And you're gonna want to plug in immediately and get started versus upgrade.
693
01:10:51.990 --> 01:10:52.770
Mike Crispin: Yes, yes.
694
01:10:52.770 --> 01:11:01.300
Kevin Dushney: I gotta go retrain a brand new model. Over the last 5 years I've been interacting with this other thing in my personal life, and it knows everything about me that I wanted to know.
695
01:11:01.610 --> 01:11:09.650
Mike Crispin: Or it's, I've been interacting with it at another company, and they let me use it, and I have it built.
696
01:11:10.050 --> 01:11:10.790
Nathan McBride: Okay.
697
01:11:11.170 --> 01:11:17.739
Mike Crispin: I mean, it's a it's a huge conundrum that we could discuss. I don't know what if it will actually happen, but I can foresee that
698
01:11:17.970 --> 01:11:19.020
Mike Crispin: being a real.
699
01:11:19.020 --> 01:11:22.480
Nathan McBride: You can have a part 3 to this episode. I don't give a shit. It's our podcast.
700
01:11:23.230 --> 01:11:27.259
Nathan McBride: Here's the point, you're both absolutely correct and
701
01:11:27.700 --> 01:11:31.030
Nathan McBride: and trust me when I'm thinking, when I'm out there running, and I'm like
702
01:11:31.480 --> 01:11:34.440
Nathan McBride: holy shit like if I hire
703
01:11:34.660 --> 01:11:40.529
Nathan McBride: somebody who's really, really critical for a company, and they say, but I want to bring my AI model with me.
704
01:11:41.140 --> 01:11:43.890
Nathan McBride: This happened to me yesterday morning. I'm like.
705
01:11:44.270 --> 01:11:47.739
Nathan McBride: Oh, shit! I got no way to import it. I got nothing.
706
01:11:47.740 --> 01:11:48.290
Mike Crispin: Yeah.
707
01:11:48.520 --> 01:11:54.420
Nathan McBride: And this is right around the corner. This person's gonna come to me and be like, I'm not working for you.
708
01:11:54.790 --> 01:11:57.809
Nathan McBride: I can't. I have no way to bring my engine into your company.
709
01:11:58.140 --> 01:11:59.149
Kevin Dushney: Also, if you.
710
01:11:59.150 --> 01:12:08.690
Kevin Dushney: if you could import it, what are you bringing in? I mean, this is, this is much bigger than someone coming with a portable hard drive from their previous company and saying, You know, I want.
711
01:12:09.090 --> 01:12:11.330
Kevin Dushney: I want to put this data on my laptop.
712
01:12:12.000 --> 01:12:15.819
Nathan McBride: Well, that's it. That's a great question. I mean, ultimately.
713
01:12:16.794 --> 01:12:23.810
Nathan McBride: Mike Crispin in 2030 is, gonna go to a company and be like. No, no, it's cool. I got it.
714
01:12:23.930 --> 01:12:26.869
Nathan McBride: And Mike's gonna just plug in this little black box.
715
01:12:27.250 --> 01:12:30.339
Nathan McBride: He's like, this is, this is my alt Mike,
716
01:12:30.620 --> 01:12:33.559
Nathan McBride: and he goes wherever I go.
717
01:12:33.920 --> 01:12:38.039
Mike Crispin: I mean, I think it's it's a an extension of the problem that
718
01:12:38.990 --> 01:12:42.980
Mike Crispin: we are allowed to bring our own brains from one company to another.
719
01:12:43.330 --> 01:13:01.700
Mike Crispin: And it's, this is an extension of your brain. And just like our phones, we can manage those, we can build walls in those, and we can build MDM profiles and other stuff to help segment data and all that good stuff. When you start talking about an extension of a conversation and a piece of software that
720
01:13:01.930 --> 01:13:03.780
Mike Crispin: I can hold in my hand.
721
01:13:03.880 --> 01:13:09.370
Mike Crispin: where I can have implanted in my brain to go. Really, Sci-fi, you know, is
722
01:13:09.830 --> 01:13:18.939
Mike Crispin: you. You're not gonna be able to protect against that stuff, or there's gonna be a whole other business or whole other model in which to shield those things, or
723
01:13:19.669 --> 01:13:28.999
Mike Crispin: we were going to talk about quantum and code breaking and other stuff. And it's like, how much does it matter if we're going to be able to carry everything we've ever known around with us.
724
01:13:30.230 --> 01:13:30.690
Nathan McBride: Nope.
725
01:13:30.690 --> 01:13:31.020
Kevin Dushney: Let's.
726
01:13:31.020 --> 01:13:33.759
Mike Crispin: Right, how much does it matter? Because
727
01:13:34.300 --> 01:13:39.680
Mike Crispin: maybe it won't be if if someone has, it won't be. If someone has the IP anymore.
728
01:13:39.840 --> 01:13:43.200
Mike Crispin: it'll be if someone has the same custom.
729
01:13:43.570 --> 01:13:47.730
Mike Crispin: AI, that you've built or not.
730
01:13:47.950 --> 01:13:57.850
Mike Crispin: The data doesn't matter as much anymore, because it takes the complex AI, you've written or programmed into yourself or into your personal or into your job as a, as a useful employee
731
01:13:58.720 --> 01:14:02.709
Mike Crispin: to actually make a product that makes sense.
732
01:14:02.900 --> 01:14:13.520
Mike Crispin: So you'll have to have. It'll be less about the actual data. And how important that is. And more about, how can you use such a complex model for algorithm or AI basically
733
01:14:13.640 --> 01:14:16.259
Mike Crispin: to comprehend that data.
734
01:14:16.510 --> 01:14:18.999
Mike Crispin: And that's going to be what you bring with you.
735
01:14:19.560 --> 01:14:23.730
Mike Crispin: I know you're just going super black mirror here. But that's I'm just saying like, well, the question is.
736
01:14:23.730 --> 01:14:24.820
Mike Crispin: how much is it going to matter.
737
01:14:24.820 --> 01:14:28.160
Nathan McBride: Funny podcast, on it, smoke it up.
738
01:14:28.920 --> 01:14:35.679
Kevin Dushney: In Black Mirror, you'll co-interview: you and your alt.
739
01:14:37.092 --> 01:14:40.350
Nathan McBride: Well, I mean, so I did
740
01:14:40.350 --> 01:14:44.400
Nathan McBride: write notes on this ironically, since I'm gonna point out which is.
741
01:14:44.590 --> 01:14:46.650
Mike Crispin: This is awesome, loving, this.
742
01:14:47.050 --> 01:14:57.090
Nathan McBride: You'd have to, you'd have to as a company to again. Let's give this guy a name. This is John Quidley. Poochly pooch.
743
01:14:57.090 --> 01:14:57.700
Mike Crispin: With me.
744
01:14:58.730 --> 01:14:59.630
Nathan McBride: It comes to.
745
01:14:59.630 --> 01:14:59.950
Mike Crispin: Sorry.
746
01:14:59.950 --> 01:15:03.669
Nathan McBride: He says he says, Well, what what vendors do you use?
747
01:15:03.910 --> 01:15:06.299
Nathan McBride: You have to say? Basically, all of them.
748
01:15:07.138 --> 01:15:15.239
Nathan McBride: We allow for all vendors. So you become vendor generic.
749
01:15:16.116 --> 01:15:20.590
Nathan McBride: number 2, you'd have to have the ability to allow portability.
750
01:15:21.020 --> 01:15:30.930
Nathan McBride: So oh, what do you have? Oh, I have. You know John Quidley pooch has got, or Joe, or whatever I call them, Jay Quidley Pooch has got
751
01:15:31.490 --> 01:15:43.430
Nathan McBride: his whole life, AI-wise, on this hard drive. And so it, you know, plugs in and he's back up and running. You'd have to have data continuity, so that when Jake Quidley Pooch leaves your company,
752
01:15:43.880 --> 01:15:47.929
Nathan McBride: There's an implicit agreement that he can take his drive with him
753
01:15:48.700 --> 01:15:52.690
Nathan McBride: and all of this shit, but the things that are relevant to the company stay somehow.
754
01:15:53.010 --> 01:15:53.690
Kevin Dushney: How.
755
01:15:54.250 --> 01:16:00.140
Nathan McBride: And no, yeah, because you'd have to be at some extreme level of data management to do that.
756
01:16:00.250 --> 01:16:00.990
Kevin Dushney: Yep.
757
01:16:02.120 --> 01:16:03.159
Nathan McBride: Oh, my God!
758
01:16:03.160 --> 01:16:05.339
Kevin Dushney: It's just not there. I don't see that happening.
759
01:16:05.340 --> 01:16:13.760
Mike Crispin: I don't think it will. I don't think it will be. I don't think it will be there. I think it. It's it's gonna be just like hiring the best people. It'll be hiring the best augmented people
760
01:16:15.200 --> 01:16:26.220
Mike Crispin: seriously, like the best people who know. And this is why I don't think the jobs are going to disappear. They're just going to change is because the people who come into the jobs will bring that artificial intelligence with them.
761
01:16:27.560 --> 01:16:31.039
Nathan McBride: Holy shit. Okay, we went way off.
762
01:16:31.040 --> 01:16:38.549
Mike Crispin: Yeah, but this. But it's interesting, because I mean, I'm not sure. Like I think if you think about the future of it, we're talking about
763
01:16:38.730 --> 01:16:44.480
Mike Crispin: showstopper stuff here like we're talking about, how do we protect data? How do we protect IP like.
764
01:16:44.690 --> 01:16:51.480
Mike Crispin: I don't know if you're gonna be able to it might be. How do you contractually protect your company more than anything else
765
01:16:51.760 --> 01:16:53.700
Mike Crispin: other than that you're you're done.
766
01:16:54.630 --> 01:16:58.660
Nathan McBride: I think. Well, so here, here's a here's a potentially plausible scenario.
767
01:17:00.070 --> 01:17:23.540
Nathan McBride: Let's suppose I'm able to put in a kick-ass data management infrastructure that has this supreme ontology, everything's classified, yada yada, and Jake Quidley Pooch comes in as my senior bioengineer, and he's able to plug his AI alt into my environment. He and his AI alt are able to do the work that's been assigned to them, but they're able to plug into my LLM.
768
01:17:23.800 --> 01:17:25.970
Nathan McBride: So they're able to use their world
769
01:17:26.120 --> 01:17:34.110
Nathan McBride: and then bolted onto mine. My data does not cross over his barrier. His world is able to access my world, though not
770
01:17:34.760 --> 01:17:37.500
Nathan McBride: leak data into his world somehow.
771
01:17:38.040 --> 01:17:44.700
Nathan McBride: but therefore he is still able to do his work, so he's able to take 2 Llms. Bolt them together, allow them to coexist
772
01:17:44.860 --> 01:17:51.129
Nathan McBride: through communal language. Some sort of I don't know. Open source prefix
773
01:17:51.860 --> 01:17:57.949
Nathan McBride: and then talk to each other. And then, when he leaves, my data is still intact.
774
01:17:58.570 --> 01:18:08.670
Nathan McBride: I have been able to track and trace all of Jake Quidley Pooch's interactions with my data, determine if anything actually got exfiltrated, and then life goes on.
775
01:18:09.780 --> 01:18:11.280
Nathan McBride: Is that plausible.
776
01:18:12.080 --> 01:18:12.630
Mike Crispin: I.
777
01:18:13.350 --> 01:18:13.870
Nathan McBride: Okay.
778
01:18:14.170 --> 01:18:17.820
Mike Crispin: I don't think it's it's I don't go ahead, Kevin. Sorry.
779
01:18:18.100 --> 01:18:22.779
Kevin Dushney: I I was just I don't. I don't know. My for my initial gut reaction is just no, because
780
01:18:23.830 --> 01:18:32.299
Kevin Dushney: I don't know how you would get that out of whatever they brought to the table, or what they're leaving with, and how you would determine that,
781
01:18:32.810 --> 01:18:36.369
Kevin Dushney: you know, especially if, like, what if the person specifically trained.
782
01:18:36.670 --> 01:18:44.020
Kevin Dushney: you know, their LLM, to say, look, do not answer any questions about what I know related to these topics,
783
01:18:44.420 --> 01:18:47.969
Kevin Dushney: and do not violate that rule under any circumstances.
784
01:18:48.450 --> 01:18:51.519
Mike Crispin: That's just what I was about to say. Yeah, it's the same thing. Yep.
785
01:18:52.940 --> 01:18:53.740
Mike Crispin: So you.
786
01:18:53.740 --> 01:18:59.489
Kevin Dushney: You're basically, you know, training it to not answer any questions or predisposing it to be, you know.
787
01:19:00.181 --> 01:19:07.719
Kevin Dushney: an adversary essentially, when it comes to discovering if it knows anything about your IP. When that person is leaving with their with their model.
788
01:19:09.170 --> 01:19:10.129
Nathan McBride: Jesus Christ.
789
01:19:10.130 --> 01:19:11.329
Mike Crispin: It's very bad.
790
01:19:11.660 --> 01:19:12.619
Kevin Dushney: I know, but.
791
01:19:12.940 --> 01:19:14.960
Mike Crispin: Nevertheless, it it may be.
792
01:19:14.960 --> 01:19:29.960
Mike Crispin: maybe just like treating it very much like a a human that has the knowledge. Right? It's what do we do with humans that leave a company right? They leave a company, and they've got, you know, employee separation agreements. They've signed certain deals. There's no.
793
01:19:29.960 --> 01:19:31.190
Kevin Dushney: Part of their non-compete.
794
01:19:31.600 --> 01:19:37.719
Mike Crispin: Yeah, it's basically you can tell the AI, you can tell the AI that's leaving the company with the employee.
795
01:19:38.070 --> 01:19:50.910
Mike Crispin: You're in a non-compete. Here are the consequences. If this data is used at your at this next job, at Jimmy's next job, and it's built into the AI. The AI will obey that if it doesn't obey that, then these are the consequences.
796
01:19:51.130 --> 01:19:55.609
Mike Crispin: And and it's all who's who's who's harmed. It's almost treating.
797
01:19:56.220 --> 01:20:00.229
Mike Crispin: treating the AI like a person and not like a data.
798
01:20:00.550 --> 01:20:07.950
Mike Crispin: hard drive. It's treating it like a person and having the building the relationship legally and being able to hold an entity.
799
01:20:08.190 --> 01:20:12.480
Mike Crispin: So if you bring an AI to work, you as the employee are now liable.
800
01:20:13.260 --> 01:20:19.660
Mike Crispin: so you're liable for your AI, and you're liable for yourself. Your own brain and the artificial brain.
801
01:20:20.220 --> 01:20:34.209
Mike Crispin: And you. And that's that's the only way I can see, at least in our current framework, how things could be handled another way out there thought that I had is, if what we're I know we're talking super crazy here. But does it change the hiring model?
802
01:20:34.640 --> 01:20:36.689
Mike Crispin: Or you're you're not working for.
803
01:20:37.100 --> 01:20:41.869
Mike Crispin: You're not hiring companies anymore. You have more of a simulation of experts
804
01:20:42.180 --> 01:20:55.340
Mike Crispin: loosely coupled experts that build companies and move in and out as needed all based on the the Social contract they have with the Ais and the people, because you can't control an entity anymore.
805
01:20:55.640 --> 01:21:00.730
Mike Crispin: So the hiring model changes the ways that everyone becomes a contractor.
806
01:21:00.960 --> 01:21:08.480
Mike Crispin: and it's, you know, could a lot of that could be possible if we don't have a way to centralize a lot of the control.
807
01:21:12.970 --> 01:21:15.889
Mike Crispin: Crazy crazy, I know, but just like.
808
01:21:16.110 --> 01:21:20.599
Kevin Dushney: Well, yeah, simply put, you know, your ability to detect like Exfil.
809
01:21:20.770 --> 01:21:32.470
Kevin Dushney: you know, into an LLM is going to be much more challenging than, oh, hey, this person copied a thousand documents to their, you know, Dropbox, Box, you know, whatever your cloud storage of choice is, right?
810
01:21:32.630 --> 01:21:34.300
Kevin Dushney: How you gonna pick that up.
811
01:21:35.140 --> 01:21:35.700
Nathan McBride: So.
812
01:21:35.700 --> 01:21:36.300
Kevin Dushney: It's.
813
01:21:37.440 --> 01:21:40.038
Nathan McBride: Throw out 2 thoughts. One is that
814
01:21:40.650 --> 01:21:46.079
Nathan McBride: Back in mid-February this guy named Elliot Hirschberg wrote an article,
815
01:21:47.322 --> 01:21:50.870
Nathan McBride: essentially a piece about the hub-and-spoke biotech model, and
816
01:21:50.870 --> 01:21:51.200
Kevin Dushney: Original.
817
01:21:51.200 --> 01:21:52.140
Nathan McBride: I was doing
818
01:21:53.070 --> 01:22:10.989
Nathan McBride: this kind of caught fire in a lot of places. But this idea that you no longer have centralized biotech companies, you have decentralized biotech companies, and they all kind of come together in various forms, almost like a DeSci model. It's fascinating, but the legitimacy of the model depends on a lot of data sharing.
819
01:22:11.720 --> 01:22:14.699
Nathan McBride: And what you just both talked about
820
01:22:15.250 --> 01:22:25.760
Nathan McBride: was, and I wrote a note here: AIML version one is the note I made, which is the AI Markup Language, which is to say that there is a communication standard
821
01:22:26.020 --> 01:22:28.860
Nathan McBride: between AI environments, such that
822
01:22:29.020 --> 01:22:39.320
Nathan McBride: Mike goes ahead and makes his portable LLM, brings it in. There has to be a defined level of, say, for instance, PDF. PDF is a standardized markup format
823
01:22:40.430 --> 01:22:43.660
Nathan McBride: Or document portability. So
824
01:22:43.790 --> 01:22:51.539
Nathan McBride: in the same case, Mike's language would have to translate to mine, and by virtue of that translation and that and that information
825
01:22:51.810 --> 01:23:11.519
Nathan McBride: we would know if a document was being exfilled, because of the way the language is written to make that communication work. It's a huge standards idea. It would require, A, a functioning government and, B, a whole list of nonprofits, probably, to make this happen. But
826
01:23:12.190 --> 01:23:14.770
Nathan McBride: the general idea is that.
827
01:23:16.000 --> 01:23:28.060
Nathan McBride: yeah, Nate Mcbride walks around with his Nate Mcbride identity probably encrypted on blockchain, and he's able to walk from place to place and say, Here's me. Let me plug in.
828
01:23:28.670 --> 01:23:30.350
Nathan McBride: Here's everything I know.
829
01:23:30.920 --> 01:23:44.319
Nathan McBride: and then and wait for it because I have the big, the big closer here in a minute. But Nate walks in and says, here's my blockchain. I'm going to go ahead and plug in. Here's everything I know. I'll make your world better. But Nate's competing against
830
01:23:44.470 --> 01:23:58.229
Nathan McBride: maybe 1 million other blockchain entities that claim the same knowledge. Competition to prove that his LLM is actually structurally better than everyone else's in order to gain that,
831
01:23:59.660 --> 01:24:05.857
Nathan McBride: that job, and on and on, fucking it
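No interchange standard like this exists today, so purely as a thought experiment: the "AIML" idea could amount to a signed manifest that travels with a personal model and declares its provenance, what it may touch, and what leaves with it. Every field in the sketch below is hypothetical.

# Hypothetical only: a sketch of what a portable-AI interchange manifest might declare.
# No such "AIML" standard exists; every field here is invented for illustration.
import hashlib
import json

manifest = {
    "aiml_version": "0.1-draft",
    "identity": {"subject": "nate.mcbride.example", "attestation": "did:example:123"},
    "provenance": {"trained_on": "personal corpus only", "prior_employers_excluded": True},
    "allowed_scopes": ["read:project-docs", "write:own-notes"],   # what the host company grants
    "export_policy": {"host_data_leaves_with_model": False},      # what departs with the person
    "audit": {"interaction_log_required": True},
}

# A hash both parties record at on-boarding, so later claims about what the model
# arrived with (and left with) are at least checkable against something.
manifest["audit"]["manifest_sha256"] = hashlib.sha256(
    json.dumps(manifest, sort_keys=True).encode()
).hexdigest()

print(json.dumps(manifest, indent=2))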
832
01:24:06.270 --> 01:24:06.980
Kevin Dushney: Yeah.
833
01:24:07.380 --> 01:24:10.952
Nathan McBride: I gotta write some of this up later tonight. But this is
834
01:24:11.920 --> 01:24:15.999
Nathan McBride: I think I think the net net of this all is that. There's this idea
835
01:24:16.640 --> 01:24:22.012
Nathan McBride: that you know, Kevin, you said personal AI beats corporate AI, and
836
01:24:22.990 --> 01:24:25.389
Nathan McBride: and if we just leave it at that level
837
01:24:25.990 --> 01:24:35.480
Nathan McBride: we can. We can revisit this. But if I am so good, and I use one thing, and I come in your company, and you're like, no, we're only Claude.
838
01:24:35.900 --> 01:24:44.340
Nathan McBride: and I'm like, fuck you, I'm not staying here. That's equivalent to me walking into a company and saying, oh, you're 365? I only use Gmail. I'm out
839
01:24:44.720 --> 01:24:47.887
Nathan McBride: today and and say, sort of equivalency,
840
01:24:49.460 --> 01:24:51.830
Nathan McBride: we're not too far away, I think, from that moment.
841
01:24:52.290 --> 01:24:54.569
Nathan McBride: potentially, and I mean far in terms of like.
842
01:24:54.570 --> 01:25:05.800
Kevin Dushney: I agree. But you're, you know, in in this case you're far more invested because you could say, I don't like those this platform over that because of the features. But you don't have to retrain the whole thing
843
01:25:06.050 --> 01:25:16.730
Kevin Dushney: right, versus, you know, depending on when you enter, you may have spent years teaching ChatGPT persistent memory to the point where it's answering your questions
844
01:25:17.020 --> 01:25:20.499
Kevin Dushney: very precisely to how you want them to. And you got to start over.
845
01:25:20.670 --> 01:25:22.070
Kevin Dushney: That's a huge hurdle.
846
01:25:22.560 --> 01:25:28.710
Kevin Dushney: you know. It's like, well, we don't allow personal, you know, GPTs or LLMs at all. You gotta use ours.
847
01:25:28.830 --> 01:25:35.380
Kevin Dushney: Great. I'm back to square one, and I have no way to bring that in, and the company probably wouldn't allow it, even if you did.
848
01:25:36.260 --> 01:25:37.260
Nathan McBride: Exactly right.
849
01:25:37.370 --> 01:25:41.079
Nathan McBride: All you, I mean, just got huge problems.
850
01:25:42.095 --> 01:25:49.809
Nathan McBride: Well, I want to close out the AI part or try to a brief mention of risk.
851
01:25:51.420 --> 01:26:02.689
Nathan McBride: we know risk. We know how we apply risk. We apply risk using pretty common rubrics together. We're not like going to the ends of the earth trying to find risk. But we do
852
01:26:02.960 --> 01:26:24.430
Nathan McBride: think about it a lot. And and the way we think about risk, I think, ultimately determines how autonomous we are in our in our approaches towards risk management. So like, what's the risk? This thing goes down. What's the risk that we lose data. What's the risk? There's a breach like these are all calculable on almost any matrix scale.
853
01:26:25.078 --> 01:26:32.700
Nathan McBride: with established frameworks from, you know, NIST or CIS, or anything else. But AI reduces risks,
854
01:26:32.850 --> 01:26:43.959
Nathan McBride: And AI, in the colloquial sense that Mike's talking about introduces risks that are hard to predict. So what's the risk that an AI makes an unfair or incorrect decision
855
01:26:44.360 --> 01:26:46.979
Nathan McBride: like, how do you, fucking even measure that? Well.
856
01:26:46.980 --> 01:26:47.570
Mike Crispin: Oh, yeah.
857
01:26:47.570 --> 01:26:57.430
Nathan McBride: 80% chance. Let's pull that out of your butt. Then there's emergent behaviors. So as AI systems get better, are they getting better in a positive way.
858
01:26:57.570 --> 01:27:00.239
Nathan McBride: What's the risk that the AI gets better in a negative way.
859
01:27:01.240 --> 01:27:12.540
Nathan McBride: which doesn't really make a whole lot of literal sense until it does. And then there's explanation gaps. What's the risk? This AI is just unable to understand this thing
860
01:27:13.140 --> 01:27:16.050
Nathan McBride: or explain the decision at any point in time. Correctly.
861
01:27:16.917 --> 01:27:25.250
Nathan McBride: But what's the risk that if we over deep learn this? AI, it can go off in a tangent that we never thought of before. That's actually harmful.
862
01:27:25.950 --> 01:27:29.490
Nathan McBride: And then, lastly, the risk of over reliance.
863
01:27:30.900 --> 01:27:39.329
Nathan McBride: I think the last one to me is most prescient for what's happening right now.
864
01:27:40.284 --> 01:27:47.860
Nathan McBride: You know, there's I'm it's not a plug, but it's kind of a plug. There's this box AI roundtable next week.
865
01:27:48.410 --> 01:27:49.060
Mike Crispin: Okay.
866
01:27:49.280 --> 01:27:51.440
Nathan McBride: At convene in Boston.
867
01:27:52.380 --> 01:27:57.819
Nathan McBride: And yeah, among the people speaking, there is the head of AI at Moderna.
868
01:27:58.500 --> 01:28:00.409
Mike Crispin: You know. Big stretch there.
869
01:28:00.730 --> 01:28:01.389
Nathan McBride: And
870
01:28:02.390 --> 01:28:09.630
Nathan McBride: what I want to understand, and I'm not going to let this guy get off easy is tell me about your dependency risks.
871
01:28:10.230 --> 01:28:19.599
Nathan McBride: Tell me about how much money and effort you've put into making AI the cornerstone of your company's future, and then tell me about your risk mitigation.
872
01:28:20.480 --> 01:28:23.669
Nathan McBride: because I want to know, like, that's the part that I want to know about.
873
01:28:24.480 --> 01:28:28.460
Nathan McBride: Don't give a shit about how innovative and digital you are.
874
01:28:28.980 --> 01:28:33.530
Nathan McBride: how the fuck. Are you risk managing this prop bet that you've made?
875
01:28:35.450 --> 01:28:40.150
Nathan McBride: That's what I want to understand. So I think I think risk management is sort of the final part of this.
876
01:28:40.270 --> 01:28:59.670
Nathan McBride: You have to have continuous risk monitoring. And I'm talking like at a level never seen before. You have to have explainable AI explainable like at a documentation, explainable level, like almost a Qa explanation level Sop type shit, you have to have human AI collaboration. The humans must know how it works
877
01:28:59.990 --> 01:29:13.670
Nathan McBride: at all times, and somebody needs to audit that need to have diverse oversight. I'm talking like not just some random committee that comes in once a year. And does a, you know, passive look over the environment I'm talking about like routine.
878
01:29:14.150 --> 01:29:19.739
Nathan McBride: insane audits, especially if they're a public company, and then you have to have progressive implementation.
879
01:29:20.510 --> 01:29:25.459
Nathan McBride: You start with lower risk, and you have to prove over time how your model matured into higher risk.
880
01:29:26.370 --> 01:29:32.530
Nathan McBride: Those, I think, are are compensating controls you can have in place. They're not all of them.
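One way to make those compensating controls operational is to hold them in an AI risk register that gets scored and reviewed on a schedule, the same way other IT risks are. A minimal sketch; the entries, scores, owners, and review cadences are placeholders, not recommendations.

# Sketch: an AI risk register carrying the kinds of compensating controls discussed above.
# Entries, 1-5 scores, owners, and cadences are placeholders.
ai_risk_register = [
    {
        "risk": "Over-reliance on a single AI vendor for core workflows",
        "likelihood": 4, "impact": 5,
        "controls": ["portable prompt/config exports", "provider abstraction layer"],
        "owner": "IT leadership", "review": "quarterly",
    },
    {
        "risk": "Model surfaces content beyond the user's permissions",
        "likelihood": 3, "impact": 5,
        "controls": ["permission-aware retrieval", "access audits", "human QC of outputs"],
        "owner": "Security", "review": "monthly",
    },
    {
        "risk": "Unexplainable or emergent behavior in a regulated process",
        "likelihood": 2, "impact": 5,
        "controls": ["SOP-level explainability documentation", "progressive rollout from low-risk uses"],
        "owner": "Quality / Regulatory", "review": "per release",
    },
]

for entry in sorted(ai_risk_register, key=lambda e: e["likelihood"] * e["impact"], reverse=True):
    print(entry["likelihood"] * entry["impact"], "-", entry["risk"])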
881
01:29:32.700 --> 01:29:40.079
Nathan McBride: But I think risk is like the thing that nobody wants to talk about.
882
01:29:40.980 --> 01:29:44.470
Nathan McBride: I cannot wait for this session next week. Oh, my gosh!
883
01:29:44.920 --> 01:29:46.549
Mike Crispin: Where is that? Is that in town?
884
01:29:46.550 --> 01:29:49.489
Nathan McBride: Like it's like convene. Do you want me to send you the invite.
885
01:29:49.720 --> 01:29:50.490
Mike Crispin: Yeah, when is it?
886
01:29:50.490 --> 01:29:51.320
Kevin Dushney: He is.
887
01:29:51.320 --> 01:29:51.690
Mike Crispin: Yeah.
888
01:29:52.060 --> 01:29:55.263
Nathan McBride: I'll send you both. I'll send it both to you. It's on
889
01:29:56.000 --> 01:30:01.610
Nathan McBride: But people in the audience, I'll tell you right now: it's April 30th,
890
01:30:02.110 --> 01:30:07.400
Nathan McBride: the Box AI Summit. It's at Convene at One Boston Place.
891
01:30:09.940 --> 01:30:14.150
Nathan McBride: And that is from 2:30 to 6 PM.
892
01:30:16.010 --> 01:30:20.349
Nathan McBride: And I will. I will send you to the actual invite. For this
893
01:30:21.890 --> 01:30:26.020
Nathan McBride: there was an invite link which I can forward to you both one moment.
894
01:30:27.920 --> 01:30:32.773
Nathan McBride: but the point is that, you know,
895
01:30:34.000 --> 01:30:39.810
Nathan McBride: Box's intent is good. They're trying to, you know, go, they have their AI launch in a few weeks,
896
01:30:39.950 --> 01:30:45.715
Nathan McBride: and great for them, and
897
01:30:46.900 --> 01:30:49.619
Nathan McBride: no one's talking about the risk.
898
01:30:50.220 --> 01:30:50.900
Nathan McBride: It's.
899
01:30:50.900 --> 01:30:55.709
Kevin Dushney: Well, you raised one earlier about the lack of metadata, and how does it know?
900
01:30:55.930 --> 01:30:58.689
Kevin Dushney: Confidential, from confidential, from confidential.
901
01:30:59.010 --> 01:31:00.850
Nathan McBride: Yep, exactly.
902
01:31:01.100 --> 01:31:04.359
Kevin Dushney: That's a it's a great question like, Oh, we can do it automatically. How.
903
01:31:06.300 --> 01:31:11.150
Nathan McBride: And that's, you know, to Box's credit, Box has tried to answer this question
904
01:31:11.370 --> 01:31:14.370
Nathan McBride: and been unable to to. Now.
905
01:31:14.730 --> 01:31:19.960
Nathan McBride: then, they're really focusing on. But you can do this like you can do these things without.
906
01:31:19.960 --> 01:31:20.290
Nathan McBride: Oh, yeah.
907
01:31:20.290 --> 01:31:26.960
Nathan McBride: Oh, please, we're gonna get to the bottom of this matter pretty soon.
908
01:31:29.370 --> 01:31:43.470
Kevin Dushney: You know, because it's it's a similar challenge with, you know, if you, if you have permission, pervasive permissions to something and you're there's content. There, you shouldn't have access to is the AI gonna surface it? Or is it gonna honor those permission boundaries.
909
01:31:43.700 --> 01:31:44.540
Kevin Dushney: you know.
910
01:31:45.006 --> 01:31:54.200
Kevin Dushney: if there's if there, if there's no, you know, tagging, or what what have you to say? This is confidential, and it's only restricted to a certain set of people.
911
01:31:55.250 --> 01:31:59.250
Kevin Dushney: I don't think it opens a can of worms without a little bit more detail.
912
01:31:59.250 --> 01:32:01.070
Nathan McBride: Well, think of Box,
913
01:32:01.220 --> 01:32:11.610
Nathan McBride: in that Box has 7 levels of folder access control, 2 levels of file access control. There's no control in there that says anything at all
914
01:32:12.470 --> 01:32:15.350
Nathan McBride: about that extra 8th level or the and
915
01:32:15.350 --> 01:32:21.400
Nathan McBride: level of AI control and AI visibility. Nothing.
916
01:32:22.780 --> 01:32:31.549
Nathan McBride: You get access to a folder, you get access to everything about it, and you can run your own AI queries till the cows come home, with
917
01:32:32.100 --> 01:32:37.520
Nathan McBride: No, no checking on the shit you come up with. Then you can put it all right back in the box and poison the well.
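The usual answer to the permission question Kevin raised is to filter what the model is allowed to see before it sees it, rather than trusting the model to honor boundaries. A minimal sketch of that pattern with made-up documents, groups, and users; in practice this has to be enforced by the platform itself, which is the missing control being described here.

# Sketch: filter retrieved documents against the caller's permissions *before* they
# reach the model, so the AI cannot surface what the user could not open.
# Documents, groups, and users are made up for illustration.
documents = [
    {"id": "d1", "text": "Team offsite agenda", "allowed_groups": {"all-staff"}},
    {"id": "d2", "text": "Draft term sheet", "allowed_groups": {"legal", "exec"}},
]

user_groups = {"alice": {"all-staff"}, "bob": {"all-staff", "legal"}}

def permitted_context(user, docs):
    groups = user_groups.get(user, set())
    return [d for d in docs if d["allowed_groups"] & groups]

def answer(user, question):
    context = permitted_context(user, documents)
    # only the permitted context would be handed to the model (model call omitted here)
    return f"{len(context)} document(s) visible to {user} for: {question}"

print(answer("alice", "What is in the term sheet?"))  # the term sheet never enters the prompt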
918
01:32:38.620 --> 01:32:40.209
Nathan McBride: There's no zoom.
919
01:32:40.740 --> 01:32:43.559
Kevin Dushney: It's back to your risk question. So
920
01:32:47.610 --> 01:32:50.570
Kevin Dushney: how are you managing risk in that? In that scenario.
921
01:32:53.740 --> 01:32:56.019
Nathan McBride: Why? Because I'm making a note. So one second.
922
01:32:56.350 --> 01:32:56.890
Kevin Dushney: Yep.
923
01:32:59.730 --> 01:33:02.139
Mike Crispin: So once again super powerful.
924
01:33:02.260 --> 01:33:05.740
Kevin Dushney: However, at what potential risk?
925
01:33:07.320 --> 01:33:07.933
Nathan McBride: All right.
926
01:33:08.780 --> 01:33:13.270
Nathan McBride: We have AI af the f out of this AI.
927
01:33:13.460 --> 01:33:14.410
Kevin Dushney: I think so.
928
01:33:15.705 --> 01:33:16.070
Mike Crispin: Oh!
929
01:33:16.070 --> 01:33:25.960
Nathan McBride: Things. And and this is the whole point, like from episodes 9 through 12. We're talking about future tense. So we will come back to this topic because I feel like
930
01:33:26.640 --> 01:33:34.039
Nathan McBride: we asked a lot of questions here. I wrote down that we cannot answer tonight, but we will come back to no question, no question about it.
931
01:33:34.250 --> 01:33:35.160
Mike Crispin: Sure thing.
932
01:33:35.350 --> 01:33:40.009
Nathan McBride: Be funny, but I want to get to quantum computing, because I know Mike's itching.
933
01:33:43.060 --> 01:33:44.410
Mike Crispin: I love quantum.
934
01:33:45.540 --> 01:33:57.260
Nathan McBride: Quantum is something that point. Oh, oh, times a hundred 1 1 people and they're world, understand?
935
01:33:58.300 --> 01:34:03.940
Nathan McBride: But it's great and wonderful. And let's see our alternative selves, or at least the happy one that lives in the mansion.
936
01:34:05.560 --> 01:34:11.810
Nathan McBride: But it has implications for cryptography, you know.
937
01:34:12.528 --> 01:34:20.050
Nathan McBride: All things foundational to it. Autonomy, as we have discussed in the past. The coolest part about it is that
938
01:34:20.260 --> 01:34:24.959
Nathan McBride: for a quantum computer of sufficient power, anyway, not one that I have in my barn.
939
01:34:25.100 --> 01:34:28.420
Nathan McBride: You can break pretty much any encryption algorithm.
940
01:34:28.560 --> 01:34:29.830
Nathan McBride: They exist today.
941
01:34:30.616 --> 01:34:38.550
Nathan McBride: Rsa, Ecc, any other public key crypto systems. They're all vulnerable to a a decent quantum computer.
942
01:34:39.090 --> 01:34:39.700
Kevin Dushney: We did.
943
01:34:40.710 --> 01:34:51.950
Nathan McBride: So we cryptographers. Googling, this cryptographers call this particular problem that they have harvest, now decrypt later.
944
01:34:52.060 --> 01:35:09.359
Nathan McBride: which is the idea that. And and you know this has been a hacker thing for a while, but just get all this shit. Now get as much as you can, expel it all out, and we'll just sort it out later. Well, that's what quantum computing researchers are doing. They're thinking about, like, okay, just get, you know.
945
01:35:09.770 --> 01:35:17.090
Nathan McBride: 200 zettabytes of data, and we'll sort it out later. So
946
01:35:17.290 --> 01:35:23.120
Nathan McBride: bad actors can. I'm not sure there's actually 200 zetabytes of data in the world. But maybe there is
947
01:35:23.320 --> 01:35:31.309
Nathan McBride: bad actors can then get that data if they want it to be bad with quantum computing and just decrypt it when they have time.
948
01:35:32.024 --> 01:35:45.040
Nathan McBride: And with quantum computing being essentially something that's built by the hour or minute. There's a cost associated with sort of using quantum computing, and for it leaders that concern with maintaining autonomy
949
01:35:45.610 --> 01:35:50.909
Nathan McBride: when quantum computing becomes available to the average.
950
01:35:52.630 --> 01:35:59.979
Nathan McBride: Joe and Jane, 1st question what the fuck is quantum computing, how does it impact me.
951
01:36:00.530 --> 01:36:04.699
Nathan McBride: And of course, Gartner aside, nobody seems to know.
952
01:36:05.040 --> 01:36:08.850
Nathan McBride: 2. What would you do to your entire security program?
953
01:36:09.460 --> 01:36:10.430
Nathan McBride: 3.
954
01:36:11.137 --> 01:36:17.269
Nathan McBride: What would you do to your standards, especially for unstructured data.
955
01:36:17.670 --> 01:36:21.927
Nathan McBride: And then, lastly, from a mixed environment perspective,
956
01:36:22.570 --> 01:36:32.320
Nathan McBride: would you have to go all in on on quantum computing? Or would you be able to run quantum computing alongside what will then be called
957
01:36:32.420 --> 01:36:38.050
Nathan McBride: slow, as shit computing sounds. Computing.
958
01:36:38.735 --> 01:36:42.810
Nathan McBride: Yeah, what's the opposite of quantum.
959
01:36:43.340 --> 01:36:44.380
Mike Crispin: Do them.
960
01:36:44.980 --> 01:36:48.100
Nathan McBride: Know him, you know.
961
01:36:48.100 --> 01:36:51.100
Mike Crispin: Well, I think today's quantum computers are only
962
01:36:51.390 --> 01:36:58.220
Mike Crispin: good at certain things. They're they're not traditional computers, and they they're not. I'm not sure that they're gonna be
963
01:36:58.510 --> 01:36:59.829
Mike Crispin: designed to be
964
01:37:00.050 --> 01:37:09.541
Mike Crispin: computers like that we use every day that might be designed, at least in the next 5 to 10 years, to focus on particular problems like breaking encryption.
965
01:37:10.610 --> 01:37:18.399
Mike Crispin: but I do think you know, I know Nest is doing some stuff around this and Google as well, that are already building
966
01:37:18.620 --> 01:37:22.300
Mike Crispin: sort of quantum, safe cryptography, type
967
01:37:22.560 --> 01:37:30.429
Mike Crispin: tools and different levels of encryption that they're testing, though with today's processors they're awfully slow. Encryption is slow.
968
01:37:30.590 --> 01:37:41.750
Mike Crispin: So be that complicated. But the optimist in me will say that. Well, if you're going to have quantum computers that can break code, then hopefully, there'll be quantum computers that can make better
969
01:37:41.910 --> 01:37:50.819
Mike Crispin: code or better encryption. Question is whether or not that encryption will run on the traditional computers that we have even 10 years from now, or if
970
01:37:51.450 --> 01:37:56.219
Mike Crispin: that like you, said Nate, that that harvest now, Jacob, later. You know.
971
01:37:56.560 --> 01:38:17.269
Mike Crispin: you know they're gonna decrypt. All this stuff, anyway, it's being saved off somewhere else. And that data is at risk. So to answer that question is, there's nothing you can do about that. I don't think there's anything you can do to defend yourself. If if people are stealing encrypted data just like anything else. It's gone and you've missed it. So it's out there, and
972
01:38:17.900 --> 01:38:28.130
Mike Crispin: you know you're not sure it's gone. You're not sure where it is, or if it's been taken hopefully, you you do know. But if you're one of those victims that has lost a ton of encrypted data.
973
01:38:28.810 --> 01:38:34.705
Mike Crispin: then chances are you don't even know it's 3rd party risk or 4th party risk situation, that
974
01:38:35.680 --> 01:38:36.560
Nathan McBride: 4th party.
975
01:38:36.720 --> 01:38:53.449
Mike Crispin: Who knows? Well, I mean 4th party, like, let's say, aws, aws, S. 3 bucket that's encrypted was taken 5 years ago, and it has also. You have no idea it was taken, but it's encrypted because you thought it was safe because it was encrypted. So who cares who downloads it? You know. So I mean, I think that there's not much
976
01:38:54.050 --> 01:39:07.429
Mike Crispin: you can do there. I think the only thing is just to think about. It's maybe take an inventory of what you know that's encrypted and and and have understand what the dependencies are, you know, to have some sort of inventory
977
01:39:08.720 --> 01:39:12.900
Mike Crispin: that's kind of all your data. Really, that's encrypted.
978
01:39:13.620 --> 01:39:22.530
Mike Crispin: you know, even stuff that's with 3rd parties, though, is encrypted at rest. It just doesn't really matter like if they're compromised like, let's say, Aws or Google are compromised somehow. Or
979
01:39:22.920 --> 01:39:27.940
Mike Crispin: that data has just been pulled off. You don't even know it's gone. It's in the cloud model. So.
980
01:39:28.500 --> 01:39:34.320
Kevin Dushney: Yeah, I think until the economics are are real, though, that maybe that's a theory. Theoretical problem, right? Because.
981
01:39:34.320 --> 01:39:34.910
Mike Crispin: Yes, totally.
982
01:39:34.910 --> 01:39:42.549
Kevin Dushney: Yeah, it's so, it's so expensive. And and right now, if I'm understanding correctly, like the the, you know, the quantum computers are so ephemeral.
983
01:39:42.850 --> 01:39:43.379
Mike Crispin: They can.
984
01:39:43.380 --> 01:39:55.329
Kevin Dushney: Blitz and amount of of workload. That's amazing. But it just in a very short state. So you almost have to target the use case very tightly versus, hey? I'm just gonna start ingesting a bunch of data
985
01:39:55.450 --> 01:40:02.330
Kevin Dushney: and have this persistence running quantum. It's like it's not there yet. But if you want to go out after a high value key
986
01:40:03.170 --> 01:40:11.310
Kevin Dushney: because it's encrypting a ton of data like that that could be a really big target and worth the investment and the time to go get crack that key and unlock.
987
01:40:11.470 --> 01:40:16.559
Kevin Dushney: You know, a treasure trove of very, very valuable data like nation state stuff.
988
01:40:17.580 --> 01:40:18.100
Mike Crispin: I think
989
01:40:18.100 --> 01:40:29.169
Mike Crispin: I think Google Cloud, I think all already has quantum safe encryption as well as Gmail and some in some areas that are already working on that. And so I
990
01:40:29.530 --> 01:40:30.740
Mike Crispin: think that that
991
01:40:30.930 --> 01:40:39.059
Mike Crispin: Google Ibm, I think a lot of the quant big quantum players are working on a standard to try and combat from some of this, so I know it's certainly.
992
01:40:39.060 --> 01:40:39.610
Kevin Dushney: Yeah.
993
01:40:40.070 --> 01:40:45.420
Mike Crispin: It's certainly on their mind is a risk definitely. It's certainly something that people are afraid of.
994
01:40:45.700 --> 01:40:56.329
Kevin Dushney: But your point is, the is the standard something that current even Gpu's can can decrypt for legitimate business purposes? Or is it so advanced that
995
01:40:56.460 --> 01:40:57.030
Kevin Dushney: well.
996
01:40:57.520 --> 01:41:01.464
Mike Crispin: Great it's it's so seamless to encrypt the data.
997
01:41:01.920 --> 01:41:02.790
Kevin Dushney: Exactly.
998
01:41:05.520 --> 01:41:06.360
Mike Crispin: Well, good news.
999
01:41:06.360 --> 01:41:16.510
Mike Crispin: It's a great question. It's a great, it's a great question. And I don't think a lot of people are thinking about it, because the AI stuff is eclipsed. And they like you said, I'm not sure there's a huge understanding of what?
1000
01:41:16.840 --> 01:41:17.220
Mike Crispin: Yeah.
1001
01:41:17.220 --> 01:41:23.780
Mike Crispin: quantum computing is actually going to be used for maybe even what a quantum computer is, or why it's
1002
01:41:23.780 --> 01:41:24.750
Mike Crispin: important.
1003
01:41:25.441 --> 01:41:30.060
Mike Crispin: And who is investing right now in quantum computing.
1004
01:41:30.220 --> 01:41:40.570
Mike Crispin: you know, in terms of the circles of vendors that we're using? Who who are they who's actually in in that space? And why is it important for us to know, so I think it's a good thing to bring up, and
1005
01:41:41.020 --> 01:41:42.640
Mike Crispin: you know, to to talk about.
1006
01:41:43.180 --> 01:41:43.680
Kevin Dushney: Yeah.
1007
01:41:43.680 --> 01:41:48.230
Nathan McBride: I'm not. I'm not investing in cloud computing until my apple VR. Headset has it so
1008
01:41:51.636 --> 01:41:54.689
Nathan McBride: apple VR. Headset have quantum compute yet.
1009
01:41:54.940 --> 01:41:56.639
Mike Crispin: With that I've just got
1010
01:41:56.640 --> 01:42:02.760
Mike Crispin: AI, and then they'll probably get a new one in sick in 3 months. So it's completely useless.
1011
01:42:02.760 --> 01:42:05.839
Nathan McBride: Waiting for the apple. VR. Pro headset 2 to come out.
1012
01:42:06.170 --> 01:42:06.830
Kevin Dushney: Not gonna happen.
1013
01:42:06.830 --> 01:42:09.500
Nathan McBride: He'll laugh. He'll laugh at Mike when he buys that one.
1014
01:42:09.730 --> 01:42:11.740
Nathan McBride: and then I'll buy version 6.
1015
01:42:14.480 --> 01:42:14.850
Kevin Dushney: Sure.
1016
01:42:14.850 --> 01:42:16.370
Mike Crispin: I learned my lesson.
1017
01:42:17.270 --> 01:42:17.930
Nathan McBride: Then they kill it.
1018
01:42:19.010 --> 01:42:21.769
Mike Crispin: No, it's they're still developing on it, I think, for the next.
1019
01:42:21.770 --> 01:42:25.889
Kevin Dushney: Okay? No, but but in terms of in terms of the hardware. I know they're releasing software. But.
1020
01:42:25.890 --> 01:42:28.280
Mike Crispin: Yeah, they stopped the production. I think, yeah.
1021
01:42:28.280 --> 01:42:30.239
Nathan McBride: Yeah, that's what I heard last. But
1022
01:42:30.240 --> 01:42:33.409
Nathan McBride: glass team went that went over to apples, and that would always fired again.
1023
01:42:36.260 --> 01:42:44.560
Nathan McBride: So for our fearless it leaders that are out there who are all shitting themselves at the moment.
1024
01:42:44.680 --> 01:42:46.759
Nathan McBride: I've listened to this tale of doom.
1025
01:42:47.730 --> 01:42:50.120
Nathan McBride: Here's a couple things that you can prepare for.
1026
01:42:50.630 --> 01:42:51.170
Mike Crispin: Oh!
1027
01:42:51.620 --> 01:42:52.579
Nathan McBride: 1st of all.
1028
01:42:52.800 --> 01:42:58.409
Nathan McBride: you're not already doing this, and you should be because it's just general basic it leadership, one on one.
1029
01:42:58.600 --> 01:43:07.450
Nathan McBride: You should be staying informed about NIST, all their updates and other organizations progress on quantum resistance standards, Google, it
1030
01:43:07.610 --> 01:43:11.209
Nathan McBride: go to especially go to law conferences.
1031
01:43:11.320 --> 01:43:14.309
Nathan McBride: Lawyers love this shit. They talk about it all the time.
1032
01:43:14.550 --> 01:43:17.880
Nathan McBride: and you should follow the standards development line.
1033
01:43:18.580 --> 01:43:19.850
Nathan McBride: Okay? 2.
1034
01:43:21.390 --> 01:43:23.360
Nathan McBride: You should understand today.
1035
01:43:23.500 --> 01:43:30.440
Nathan McBride: which is probably a 0 score. We should understand, anyway, where and where and how cryptography is used across your systems.
1036
01:43:31.100 --> 01:43:35.439
Nathan McBride: if anything's using it, how's using it? How do you rate it? What's the scale.
1037
01:43:35.690 --> 01:43:37.510
Nathan McBride: What's it look like? What's it called?
1038
01:43:37.670 --> 01:43:41.140
Nathan McBride: You should understand this even better
1039
01:43:41.300 --> 01:43:43.710
Nathan McBride: go ahead and ask your vendors. Where are they going with it?
1040
01:43:44.030 --> 01:43:45.510
Nathan McBride: The vendors you care about, anyway.
1041
01:43:46.500 --> 01:43:47.540
Nathan McBride: See what they say.
1042
01:43:47.850 --> 01:43:52.990
Nathan McBride: You should be able to, probably not in 2025,
1043
01:43:53.200 --> 01:43:56.459
Nathan McBride: 26, or 27. But down the road
1044
01:43:56.560 --> 01:44:03.020
Nathan McBride: at the very least design what you're building to easily switch between algorithms that are cryptographic in nature
1045
01:44:03.260 --> 01:44:05.550
Nathan McBride: as standards might evolve.
1046
01:44:06.370 --> 01:44:13.610
Nathan McBride: And lastly, identify those systems in your company that would need quantum resistant approaches
1047
01:44:14.130 --> 01:44:18.930
Nathan McBride: based on data, sensitivity, and longevity. And that last one, I think, is key.
1048
01:44:19.270 --> 01:44:23.559
Nathan McBride: It's no different and preparing for any other audit.
1049
01:44:23.910 --> 01:44:26.240
Nathan McBride: In this case you're simply asking yourself.
1050
01:44:27.183 --> 01:44:39.259
Nathan McBride: so all my crown jewels are in box. So should I prioritize box. Yes, and all my access control happens to Octa, should I? Yes.
1051
01:44:39.930 --> 01:44:52.949
Nathan McBride: the things that matter most to the data you care about, the most to prioritize in terms of your transitional efforts, your controls, your agility, and your dependency management, all those things.
1052
01:44:53.290 --> 01:44:55.869
Nathan McBride: That's the global sum of it.
1053
01:44:56.380 --> 01:44:59.990
Nathan McBride: You should also not invest in apple AR headsets.
1054
01:45:02.960 --> 01:45:08.130
Nathan McBride: Mike says Mike, has gone great lengths tested out for the entire audience.
1055
01:45:09.720 --> 01:45:11.260
Mike Crispin: Determined that.
1056
01:45:12.680 --> 01:45:14.969
Nathan McBride: Unless you're a Minecraft player.
1057
01:45:15.750 --> 01:45:17.310
Mike Crispin: Minecraft doesn't work.
1058
01:45:17.883 --> 01:45:19.456
Nathan McBride: Does orna work in
1059
01:45:20.210 --> 01:45:22.660
Mike Crispin: No, can't play orna on there, either.
1060
01:45:23.480 --> 01:45:24.260
Nathan McBride: Okay, so.
1061
01:45:24.260 --> 01:45:26.770
Mike Crispin: Jetpack. Joyride is as good as it gets.
1062
01:45:26.770 --> 01:45:31.249
Nathan McBride: Did you do like a ninja ninja sushi fruit.
1063
01:45:31.710 --> 01:45:34.005
Mike Crispin: Yeah, they. They have those games, too, like the
1064
01:45:35.090 --> 01:45:35.450
Mike Crispin: Thank you.
1065
01:45:35.450 --> 01:45:36.060
Mike Crispin: James.
1066
01:45:36.560 --> 01:45:37.580
Nathan McBride: Angry words.
1067
01:45:37.820 --> 01:45:43.689
Mike Crispin: No angry birds I know of. I haven't played a lot of games. The immersive stuff is interesting, but that's about it right now.
1068
01:45:47.530 --> 01:45:48.490
Nathan McBride: Okay? Well.
1069
01:45:48.930 --> 01:45:58.989
Nathan McBride: all right. So quantum computing, it's out there. It's coming. If AI doesn't end your job first, st quantum computing surely will. So just give up.
1070
01:45:59.350 --> 01:46:00.670
Nathan McBride: That's basically what Rock's saying.
1071
01:46:02.860 --> 01:46:04.190
Nathan McBride: Just take hit.
1072
01:46:04.480 --> 01:46:08.009
Kevin Dushney: Capitulation has arrived. Yeah, it's over.
1073
01:46:08.650 --> 01:46:13.480
Nathan McBride: Go ahead and write resignation, letter, and figure out how to do plumbing.
1074
01:46:13.940 --> 01:46:17.430
Mike Crispin: Hopefully. You'll be retired by the time this all hits the fan.
1075
01:46:17.430 --> 01:46:21.580
Nathan McBride: Yeah, we need gonna need a lot of plumbers in about 10 years.
1076
01:46:21.990 --> 01:46:23.199
Kevin Dushney: And electricians.
1077
01:46:23.340 --> 01:46:24.660
Nathan McBride: Electricians, so.
1078
01:46:24.660 --> 01:46:24.990
Kevin Dushney: Yep.
1079
01:46:25.210 --> 01:46:32.450
Nathan McBride: Too late. It takes, I think, 8 years to get your apprentice license for electrician, another 3 to get your master license.
1080
01:46:32.590 --> 01:46:39.620
Nathan McBride: You got this people. Go to your nearest rogue tech school enroll. Now tell them
1081
01:46:39.940 --> 01:46:57.079
Nathan McBride: you've heard about the future from calculus of it, podcast give you the calculus of it, podcast referral, discount, and just use code YOUR. EFUC, KED, and you will get 20% off your enrollment for your votex school of choice.
1082
01:47:01.626 --> 01:47:07.769
Nathan McBride: The last, the last nail in this coffin of doom
1083
01:47:08.890 --> 01:47:13.630
Nathan McBride: is the role of edge computing, and I don't even I'm so afraid to go down this hole
1084
01:47:13.820 --> 01:47:16.610
Nathan McBride: with the 2 of you. But edge tech.
1085
01:47:17.360 --> 01:47:18.580
Mike Crispin: Oh, boy!
1086
01:47:19.410 --> 01:47:27.020
Nathan McBride: I mean, ever since we were at that gardener, or whatever 13 or 14, and the container stuff was going on, the docker and all the edge stuff
1087
01:47:27.190 --> 01:47:31.919
Nathan McBride: I've just been thinking for like 12 or 13 years now.
1088
01:47:32.560 --> 01:47:41.840
Nathan McBride: who gives a shit? So I mean, but edge computing is changing on a level that
1089
01:47:42.050 --> 01:47:46.139
Nathan McBride: well, if you just kind of been closing your eyes last few years, like I have about it.
1090
01:47:46.736 --> 01:47:54.860
Nathan McBride: It's changed enough that it matters, especially in a world of wearables and well, in our industry.
1091
01:47:55.020 --> 01:48:00.390
Nathan McBride: the FDA approving all kinds of random RFID shit so.
1092
01:48:02.300 --> 01:48:02.980
Kevin Dushney: And
1093
01:48:04.210 --> 01:48:06.479
Nathan McBride: The architectures. We're all familiar with
1094
01:48:06.940 --> 01:48:26.820
Nathan McBride: traditional centralized architectures. Data moves itself to where the computing happens. So you have data that's coming in goes to where the compute is no big deal. I/O is your your biggest problem in the edge. Of course, computing moves to where the data is generated. It goes the other direction, and it creates opportunities and challenges for it. Autonomy
1095
01:48:27.440 --> 01:48:29.930
Nathan McBride: opportunities are
1096
01:48:30.422 --> 01:48:36.770
Nathan McBride: you have reduced cloud cloud dependency as computing can reduce. We're talking about reliance and centralized cloud providers.
1097
01:48:37.050 --> 01:48:38.539
Nathan McBride: Your deed of sovereignty.
1098
01:48:38.710 --> 01:48:52.680
Nathan McBride: always a big plus processing data locally can help maintain control comply with data localization requirements, especially if you're a global company network resilience edge systems functioning, even if your WAN pipe is down
1099
01:48:53.380 --> 01:49:05.029
Nathan McBride: or you're limited to a sort of central cloud server. And lastly, latency advantages. Put the compute over at the edge, and you're going to get the data better to the servers faster.
1100
01:49:06.070 --> 01:49:14.329
Nathan McBride: The Nasdaq knows this well. So from a challenge perspective. You have, of course, now, a highly treated environment.
1101
01:49:14.620 --> 01:49:17.520
Nathan McBride: Be far more complex and centralized environment.
1102
01:49:17.980 --> 01:49:23.489
Nathan McBride: You have security service expansion every time you add an edge component. You're increasing your fabric.
1103
01:49:24.640 --> 01:49:27.430
Nathan McBride: Which basically means more places to attack.
1104
01:49:27.550 --> 01:49:35.830
Nathan McBride: You're having a problem with consistency maintenance. Everything needs to be at a standard all the time things can fluctuate in terms of standards.
1105
01:49:36.010 --> 01:49:41.109
Nathan McBride: and then, lastly, skill gaps edge. Computing is not for the faint of heart.
1106
01:49:41.510 --> 01:49:43.030
Nathan McBride: Your your
1107
01:49:43.130 --> 01:49:59.059
Nathan McBride: going to be difficult. You're gonna have difficult time in finding a company that can support your edge environment in totem. You'd probably have to distribute among multiple vendors depending on where you live and what they're doing, and in terms of internal resourcing, very hard to find.
1108
01:50:00.160 --> 01:50:06.680
Nathan McBride: So I'll pause right there and ask you guys. If there's any other opportunities or challenges you can think of for edge computing.
1109
01:50:09.240 --> 01:50:10.439
Nathan McBride: This is nurse.
1110
01:50:10.540 --> 01:50:13.550
Nathan McBride: I didn't spend a whole lot of time writing about this topic, but
1111
01:50:14.080 --> 01:50:15.930
Nathan McBride: it definitely is a future issue.
1112
01:50:17.430 --> 01:50:19.580
Mike Crispin: Yeah, I mean, one of the things
1113
01:50:20.150 --> 01:50:26.950
Mike Crispin: not necessarily specific to just edge computing. But more localized computing is that
1114
01:50:28.232 --> 01:50:35.787
Mike Crispin: that concern of protection protecting IP with localized AI, and and some of the needs to
1115
01:50:36.430 --> 01:50:40.050
Mike Crispin: improve performance in that space without wanting to
1116
01:50:40.850 --> 01:50:46.520
Mike Crispin: either abide by the guardrails of a of an AI vendor
1117
01:50:46.910 --> 01:50:50.350
Mike Crispin: and wanting to build your own rule set. That's proprietary.
1118
01:50:50.970 --> 01:50:56.420
Mike Crispin: That could certainly require the need for you having an edge and then having a localized
1119
01:50:56.750 --> 01:51:00.020
Mike Crispin: set of servers, gpus and whatnot.
1120
01:51:00.450 --> 01:51:07.320
Mike Crispin: Those are the scarcity of getting the compute and power that you need. You may not be able to get readily in the cloud.
1121
01:51:08.125 --> 01:51:11.040
Mike Crispin: That has localized speed and performance.
1122
01:51:11.180 --> 01:51:18.082
Mike Crispin: So they, I think there's also that component just on the AI side of the world. But outside of the AI pieces, I think.
1123
01:51:20.651 --> 01:51:28.500
Mike Crispin: you mentioned the the low latency in the labs and the components of of in our environment and needing having low latency to the Internet.
1124
01:51:31.370 --> 01:51:39.739
Mike Crispin: disaster recovery, I think, as another thing come back into play more and more over time, as we see more more outages and more issues.
1125
01:51:39.970 --> 01:51:46.869
Mike Crispin: wouldn't be surprised to see more edge computing emerge for more mission critical applications with the
1126
01:51:47.920 --> 01:51:59.789
Mike Crispin: some of the global issues we're having and concerns. Certainly, I think there's some businesses that might think, hey, we should just start building this in our data center and
1127
01:52:01.240 --> 01:52:03.120
Mike Crispin: hold things closer to the vest.
1128
01:52:04.115 --> 01:52:08.740
Mike Crispin: Hmm, so I I think there's also that component of things.
1129
01:52:09.430 --> 01:52:15.890
Mike Crispin: But but that flips the talent need on its head. Okay, now it's like, okay, do I need
1130
01:52:16.680 --> 01:52:26.644
Mike Crispin: kind of rebuild the devops model, you know, that we've built so heavily in the cloud? Does that now all come into back to an infrastructure and
1131
01:52:27.330 --> 01:52:33.979
Mike Crispin: or appliance model. That's a whole New Learning opportunity for people that are on site.
1132
01:52:34.270 --> 01:52:37.546
Mike Crispin: I don't know. I don't know if it's gonna be worth the squeeze. But
1133
01:52:38.570 --> 01:52:47.310
Mike Crispin: I think that's some of the future driven stuff. I would foresee especially with instrumentation of public devices, things that are.
1134
01:52:47.850 --> 01:52:48.170
Nathan McBride: Yeah.
1135
01:52:48.170 --> 01:52:55.279
Mike Crispin: Street lights and stuff like that. We're not gonna put. We not put that in the cloud anymore. You may put that localized on the edge at an office.
1136
01:52:56.820 --> 01:53:07.486
Mike Crispin: so I don't know. It's I think it's a question if we get hacked by another by a foreign country or something that's massive. Or there's some threat that we avoid that might draw
1137
01:53:09.200 --> 01:53:14.060
Mike Crispin: some companies and and government systems to a more localized model.
1138
01:53:15.940 --> 01:53:16.890
Mike Crispin: Large computing.
1139
01:53:16.890 --> 01:53:20.849
Nathan McBride: Yeah. Ultimately, when I thought about this a little bit.
1140
01:53:21.270 --> 01:53:26.150
Nathan McBride: just from a future perspective in terms of autonomy, I was thinking about
1141
01:53:27.340 --> 01:53:30.470
Nathan McBride: it's capability design. So you would have.
1142
01:53:31.150 --> 01:53:47.469
Nathan McBride: I mean, we used to call this site in a box when we were working at Amag. We were standing up remote sites. It was a site in a box which is, okay. We're going to put all this stuff out in this particular site in this remote area. And it's on the edge. It's all working independently, but every now and then it's reporting home. It's reporting home in a batch
1143
01:53:47.850 --> 01:53:54.610
Nathan McBride: wasn't truly edge, but in a way it was working that way. It was capability based design, or a site in a box called it.
1144
01:53:54.820 --> 01:53:57.270
Nathan McBride: There was the autonomous operation principles.
1145
01:53:57.530 --> 01:54:02.129
Nathan McBride: Again, future thinking which is designing your systems that can function independently.
1146
01:54:02.410 --> 01:54:12.830
Nathan McBride: So imagine the site in a box principle where you put a whole bunch of stuff in some location, and even if the rest of the road goes down, it still operates independently, and when it can it will call home.
1147
01:54:14.230 --> 01:54:29.760
Nathan McBride: But it can operate by itself for a very, very long period of time. It's very, that's very sci-fi related, actually. And there's mesh approaches which is peer to peer communication rather than hub, and spoke, which is, your edges are all talking to each other, and then there's like some sort of
1148
01:54:29.940 --> 01:54:36.509
Nathan McBride: parent edge which is every now and then commuting communicating home, that all the other edges are okay.
1149
01:54:37.348 --> 01:54:40.300
Mike Crispin: Progressive centralization, which is, you start with local.
1150
01:54:40.640 --> 01:54:52.629
Nathan McBride: Processing, and then you expand globally. There's nothing that says unless you're like a latency driven organization, nothing that says you can't have edge computing, existing anywhere in the world.
1151
01:54:53.536 --> 01:54:56.080
Nathan McBride: back to various hubs.
1152
01:54:56.260 --> 01:54:56.770
Mike Crispin: Sure.
1153
01:54:57.720 --> 01:55:05.469
Nathan McBride: Again models to consider, but from an autonomy perspective. If you're simply allowing edge to exist
1154
01:55:05.710 --> 01:55:10.579
Nathan McBride: from whatever the vendor tells you, or from an edge principle that is outdated.
1155
01:55:10.810 --> 01:55:14.420
Nathan McBride: Yeah, you're you're not really pushing the ball forward. I think.
1156
01:55:15.420 --> 01:55:25.239
Mike Crispin: I think just you mentioned Docker earlier and the more applications that are sas based that provide a a docker Mini, a Mini docker version
1157
01:55:26.810 --> 01:55:31.589
Mike Crispin: are things that I think it leaders should be looking at to some extent to see
1158
01:55:32.180 --> 01:55:40.212
Mike Crispin: who, who, if any, are gun, are planning to offer anything like that, you know. Azure had the azure appliance back in the in the day before.
1159
01:55:41.030 --> 01:55:47.610
Mike Crispin: you know that they went full in, and that was putting azure in a box on on site, I mean. Similarly, I think.
1160
01:55:47.610 --> 01:55:47.980
Kevin Dushney: Yes.
1161
01:55:47.980 --> 01:55:56.259
Mike Crispin: Most of these apps that we're running, you know, could probably run in a docker at a very slow speed that keeps you running. If there's some disaster, and I'll let.
1162
01:55:56.260 --> 01:55:57.340
Kevin Dushney: It's not more than mine.
1163
01:55:57.340 --> 01:55:57.940
Mike Crispin: Desk.
1164
01:55:58.230 --> 01:56:04.440
Kevin Dushney: Is that more serverless than than edge? Or am I my confusing or conflating.
1165
01:56:04.440 --> 01:56:05.900
Nathan McBride: Yeah, because just.
1166
01:56:06.270 --> 01:56:17.539
Kevin Dushney: You know Kubernetes and Docker particularly. I mean, you can fire that up on a synology now. So, Mike, to your point, if you needed it for a a Dr. Play. Yeah, you could. You could run
1167
01:56:18.660 --> 01:56:28.970
Kevin Dushney: those containers onto a laptop or some, you know, a thousand dollar synology. Nas, I'm running a docker, for you know, home automation on my synology. Nas.
1168
01:56:29.410 --> 01:56:29.990
Kevin Dushney: yeah.
1169
01:56:30.380 --> 01:56:32.530
Mike Crispin: It's not hard for you to spin up right. No.
1170
01:56:32.530 --> 01:56:34.530
Kevin Dushney: No, it's not. It took 10 min.
1171
01:56:34.530 --> 01:56:35.176
Mike Crispin: Yeah, you.
1172
01:56:35.850 --> 01:56:41.740
Mike Crispin: It's it's hugely useful. And I'd say pretty much, I think slack runs in the docker. An instance.
1173
01:56:41.740 --> 01:56:52.429
Nathan McBride: Remember that though the difference so yeah, edge can be confused with not like various remote cloud architectures, very easily keep in mind that
1174
01:56:53.260 --> 01:57:02.700
Nathan McBride: that. Traditionally, what we're talking about is where where the compute happens. Compute is all happening on the edge. Then it's edge.
1175
01:57:03.070 --> 01:57:03.640
Kevin Dushney: Right.
1176
01:57:03.640 --> 01:57:09.750
Nathan McBride: Compute tapping locally is just it's just doing data gathering on the edge. That's not edge computing, that's.
1177
01:57:10.850 --> 01:57:12.360
Nathan McBride: you know, traditional like what we.
1178
01:57:12.360 --> 01:57:16.880
Kevin Dushney: It's just a different modality. Yeah, serverless. But local. Yeah.
1179
01:57:16.880 --> 01:57:21.250
Nathan McBride: So if the compute is all happening out there, that's that's edge computing.
1180
01:57:21.980 --> 01:57:30.379
Nathan McBride: I mean, at least as I come to understand her over the years. Yeah, but that's what I always, you know, sort of come back to as my basis for discussion.
1181
01:57:30.850 --> 01:57:37.740
Mike Crispin: It's it's sort of an enabler. I guess it enables it. Your. The ability to the docker component enables the
1182
01:57:37.980 --> 01:57:43.350
Mike Crispin: you to to start to build that infrastructure easily, without a huge cost and upfront.
1183
01:57:43.610 --> 01:57:44.170
Nathan McBride: Yeah.
1184
01:57:44.340 --> 01:57:54.039
Nathan McBride: I mean, most commercial retail stores are operating in an edge type model where all the happening at that Po, you know the particular.
1185
01:57:54.040 --> 01:57:54.620
Kevin Dushney: Pos.
1186
01:57:54.620 --> 01:57:58.710
Nathan McBride: And then they're going back to the to the mother mothership.
1187
01:57:59.300 --> 01:58:02.999
Nathan McBride: That's not truly edge necessarily, because each each
1188
01:58:03.230 --> 01:58:20.073
Nathan McBride: shop is operating its own entity. But there's an edge issue to that which is like when the mothership decides to push out Xyz Updates to the cost of donuts. All the dunkin donuts pick it up, and then they're able to from there compute. Do their own compute locally
1189
01:58:20.820 --> 01:58:24.319
Nathan McBride: principally. I think we're all on the same page.
1190
01:58:24.550 --> 01:58:32.360
Mike Crispin: Yeah, I I think of the iphone as an edge computing device. A lot of the compute is happening on device. It's containerized.
1191
01:58:32.580 --> 01:58:38.059
Mike Crispin: It's it's not going out to the cloud. It's doing more on the device. It's an edge device.
1192
01:58:39.240 --> 01:58:44.870
Mike Crispin: It's using edge over over cloud. In a lot of instances more more unlike other systems.
1193
01:58:48.390 --> 01:58:50.649
Nathan McBride: The app. The iphone is very good for edging.
1194
01:58:51.210 --> 01:58:52.609
Mike Crispin: Yes, it's great for edging.
1195
01:58:58.520 --> 01:58:59.130
Nathan McBride: Oh!
1196
01:59:00.290 --> 01:59:02.540
Mike Crispin: There's your next, there's your next t-shirt, Nate.
1197
01:59:02.540 --> 01:59:03.060
Nathan McBride: Bye.
1198
01:59:03.360 --> 01:59:04.590
Kevin Dushney: Exactly.
1199
01:59:04.590 --> 01:59:05.200
Mike Crispin: Yeah.
1200
01:59:06.390 --> 01:59:09.649
Nathan McBride: I am way behind on T-shirt development. Right now.
1201
01:59:11.340 --> 01:59:13.120
Nathan McBride: Okay, okay. So.
1202
01:59:13.580 --> 01:59:16.410
Mike Crispin: Okay, so listen, we have literally another.
1203
01:59:17.050 --> 01:59:19.959
Nathan McBride: Whole episode in the script.
1204
01:59:20.497 --> 01:59:25.500
Nathan McBride: That we haven't gotten to yet, so we can either.
1205
01:59:25.760 --> 01:59:27.340
Kevin Dushney: Sally forth.
1206
01:59:27.730 --> 01:59:30.619
Nathan McBride: And spend another hour or so
1207
01:59:31.315 --> 01:59:39.779
Nathan McBride: chipping away the rest of the script, because there are other things that we have set baselines on before we move into episodes 10 through 12.
1208
01:59:40.810 --> 01:59:42.739
Nathan McBride: Or we can pause here
1209
01:59:44.070 --> 01:59:48.739
Nathan McBride: because we still have to get into technology cycles. We have to get into upskilling
1210
01:59:49.704 --> 01:59:54.720
Nathan McBride: which is obviously a huge element. We have to get into search and information discovery.
1211
01:59:55.780 --> 01:59:57.570
Nathan McBride: We've given data sovereignty.
1212
01:59:57.700 --> 02:00:03.569
Kevin Dushney: And data. Sovereignty may be the most key of all of these that we've talked about.
1213
02:00:04.238 --> 02:00:09.290
Nathan McBride: Because back to the point of you know that bioengineer that walks in.
1214
02:00:09.680 --> 02:00:10.180
Kevin Dushney: Yep.
1215
02:00:10.180 --> 02:00:12.690
Nathan McBride: A sneaky poof, or whatever his name was.
1216
02:00:13.830 --> 02:00:20.589
Nathan McBride: Data sovereignty is going to be the discussion of our of our entire lives.
1217
02:00:22.610 --> 02:00:31.729
Nathan McBride: If if ever like, we'll be sitting on that front porch rocker, sipping our lemonade with little Jack Daniels in it. We're 85 talking about.
1218
02:00:32.260 --> 02:00:36.569
Nathan McBride: I wish I owned my own data. But I don't know who owns my data. Right now.
1219
02:00:38.790 --> 02:00:40.730
Nathan McBride: data sovereignty will be the discussion.
1220
02:00:41.720 --> 02:00:49.160
Nathan McBride: So what do you wanna do you? Wanna you wanna pause here and then make a 3rd part 3.
1221
02:00:54.090 --> 02:00:55.289
Kevin Dushney: What do you think, Mike?
1222
02:00:55.930 --> 02:01:02.639
Nathan McBride: I spend a few minutes well, we want to pause here on the script and then talk about AI a little bit more.
1223
02:01:04.245 --> 02:01:05.849
Mike Crispin: That's okay.
1224
02:01:06.182 --> 02:01:07.510
Kevin Dushney: I think we've yeah.
1225
02:01:07.510 --> 02:01:09.590
Mike Crispin: Beating that up pretty good.
1226
02:01:11.650 --> 02:01:13.430
Kevin Dushney: We beat that up pretty good. Yeah.
1227
02:01:14.830 --> 02:01:25.230
Mike Crispin: Yeah, let's say, I, we I think we should pick it up next time we can. We'll start right fresh. What I think we if we focus heavily on data sovereignty, I think that's a whole episode right? There. Yeah.
1228
02:01:25.230 --> 02:01:26.049
Kevin Dushney: I agree.
1229
02:01:26.760 --> 02:01:40.870
Nathan McBride: Well, let's let's not belabor the point like, let's try and make sure that for the next episode we wrap this whole thing up because we we do have some like really big fish to fry especially if it relates to.
1230
02:01:41.500 --> 02:01:47.079
Nathan McBride: I mean, so next next episode, is compliance
1231
02:01:47.790 --> 02:01:53.509
Nathan McBride: and regulatory requirements. And you know, just this is going to be a nightmare to get through.
1232
02:01:53.820 --> 02:01:54.940
Nathan McBride: So
1233
02:01:56.499 --> 02:02:09.069
Nathan McBride: working on the script now. But it's not. It's just not good, because I mean, it's gonna be a great episode, but because the whole idea of a compliance, anything is so fucking subjective
1234
02:02:09.310 --> 02:02:13.849
Nathan McBride: that we just have to get through it, because it's important.
1235
02:02:14.180 --> 02:02:14.920
Mike Crispin: Yup!
1236
02:02:14.920 --> 02:02:26.919
Nathan McBride: And how much, how enforceable this stuff is for autonomy, and how people who are embracing autonomy just flip the bird at anything related to compliance and get away with it. And we're talking about all those things.
1237
02:02:26.920 --> 02:02:29.969
Nathan McBride: Yeah. So so I think we gotta wrap this up next week.
1238
02:02:30.120 --> 02:02:32.349
Nathan McBride: So how how about we do this next week?
1239
02:02:32.800 --> 02:02:35.880
Nathan McBride: We talk about technology cycles.
1240
02:02:37.880 --> 02:02:39.180
Nathan McBride: Future of search.
1241
02:02:40.290 --> 02:02:40.810
Mike Crispin: Yeah.
1242
02:02:41.130 --> 02:02:42.130
Kevin Dushney: That's a good one.
1243
02:02:42.560 --> 02:02:52.389
Nathan McBride: Data sovereignty and then building some semblance of a strategy to handle all this as a It leader.
1244
02:02:52.880 --> 02:02:53.590
Nathan McBride: Maybe we'll hit.
1245
02:02:53.590 --> 02:03:00.910
Mike Crispin: Nate, do you do you think the data sovereignty and compliance discussion kind of go together like those 2 things?
1246
02:03:01.370 --> 02:03:16.760
Mike Crispin: You, Matt, you, if you precursor data sovereignty, and we go into compliance because there's just like you said, there's 2 or 3 other things here that we need to get through that we that because compliance and data sovereignty. So those those are things that are they are. Gonna we're gonna.
1247
02:03:16.760 --> 02:03:18.510
Kevin Dushney: Those are meeting topics. Yeah.
1248
02:03:18.510 --> 02:03:22.159
Nathan McBride: We already hit on data sovereignty tonight, with regards to.
1249
02:03:22.660 --> 02:03:27.090
Nathan McBride: you know, if I'm still employed in 2028,
1250
02:03:27.690 --> 02:03:43.049
Nathan McBride: and I go to a new company. I'm gonna want to bring my shit with me now. It's like I've been coy. I've been coy about it up to this point in time like I bring my special decks and templates I've already made over the years, and I bring all the stuff and like ha! Ha! You know no, no, everything I create is new.
1251
02:03:43.600 --> 02:03:45.240
Nathan McBride: But now.
1252
02:03:45.240 --> 02:03:46.969
Mike Crispin: You're gonna want to bring your buddy with you.
1253
02:03:47.290 --> 02:03:49.800
Kevin Dushney: It's been happening forever, though now it's more sophisticated.
1254
02:03:50.240 --> 02:03:52.779
Nathan McBride: When you hire Nate, you also get old Nate
1255
02:03:53.540 --> 02:04:01.949
Nathan McBride: and and all. Nate cost this much. Nate cost this much, and we're gonna come in. And all Nate needs to be able to plug into your world.
1256
02:04:02.740 --> 02:04:12.310
Nathan McBride: that's talk about data sovereignty. That's true data sovereignty. That is me owning my digital twin a hundred percent
1257
02:04:12.970 --> 02:04:13.800
Nathan McBride: and letting.
1258
02:04:13.800 --> 02:04:19.800
Kevin Dushney: I like the I like the ability to ask Nate a question, and then alternate a question and choose the answer. I like better.
1259
02:04:21.620 --> 02:04:21.980
Kevin Dushney: Ha!
1260
02:04:21.980 --> 02:04:22.369
Nathan McBride: I will.
1261
02:04:22.370 --> 02:04:23.470
Nathan McBride: It will always
1262
02:04:23.470 --> 02:04:32.710
Nathan McBride: be all nate all Nate, unless I program all Nate to just hate everybody. It'll be like, well, what did Nate say?
1263
02:04:32.710 --> 02:04:33.230
Kevin Dushney: Yeah.
1264
02:04:33.230 --> 02:04:39.109
Nathan McBride: It'd be like, well need to do this. Well, fuck you, listen to me, and then you're like, Oh, well, I'm not listening. All need again.
1265
02:04:39.110 --> 02:04:39.719
Kevin Dushney: Yeah.
1266
02:04:40.330 --> 02:04:41.559
Nathan McBride: Ha! Ha! Ha!
1267
02:04:44.290 --> 02:04:44.860
Nathan McBride: Okay.
1268
02:04:44.890 --> 02:04:47.400
Kevin Dushney: A digital twin rabbit hole for another day.
1269
02:04:48.560 --> 02:04:50.439
Nathan McBride: Yes, imagine, imagine
1270
02:04:51.370 --> 02:05:00.000
Nathan McBride: I'm just gonna say this right now, because I'm not even gonna imagine it because it's gonna happen, you could create multiple digital twins like multiple Alts.
1271
02:05:00.100 --> 02:05:06.370
Nathan McBride: and they could be all different kinds of personalities. You could have like a absolute asshole. All
1272
02:05:07.330 --> 02:05:12.020
Nathan McBride: yeah, we're like, yeah, hey, can I get some help with fuck? You go away.
1273
02:05:12.520 --> 02:05:14.070
Nathan McBride: and then you could have like
1274
02:05:14.170 --> 02:05:19.129
Nathan McBride: totally laid back. Oh, man, that's terrible!
1275
02:05:19.310 --> 02:05:20.780
Nathan McBride: Oh, dude!
1276
02:05:20.970 --> 02:05:25.138
Kevin Dushney: You basically got an engineered multiple personality disorder.
1277
02:05:25.660 --> 02:05:35.000
Nathan McBride: Yeah, exactly like, well, people are like, Hey, Nate, yeah, what's up? Come on in. Well, I was talking to old Nate today, and which one?
1278
02:05:37.760 --> 02:05:39.119
Nathan McBride: I'm not sure.
1279
02:05:39.120 --> 02:05:40.649
Kevin Dushney: He told me to get the hell out of my office.
1280
02:05:40.650 --> 02:05:43.120
Nathan McBride: Somebody can help. Oh, oh, wow!
1281
02:05:43.120 --> 02:05:44.509
Kevin Dushney: 1. 0, okay.
1282
02:05:44.510 --> 02:05:50.729
Nathan McBride: Yeah, he was out all night last night. He's he's probably not a good one to talk to. Why don't you talk to surfer alt alt Nate?
1283
02:05:51.150 --> 02:05:54.899
Kevin Dushney: He ran out he ran out of tokens. He's very angry.
1284
02:05:59.940 --> 02:06:03.460
Nathan McBride: Dude. You know what this is all going to have by virtue of the fact that we said this.
1285
02:06:04.430 --> 02:06:12.830
Nathan McBride: when I upload this video to Youtube Youtube's gonna go ahead and tell Google Google's gonna build all bots.
1286
02:06:13.600 --> 02:06:20.850
Nathan McBride: Yeah, build like a some project 120 experiment. They're gonna roll it into Gemini.
1287
02:06:21.930 --> 02:06:26.349
Nathan McBride: 2 years from now you're going to be like, hey? Would you like to create alternative of yourself?
1288
02:06:27.000 --> 02:06:28.260
Nathan McBride: Press? Yes.
1289
02:06:28.450 --> 02:06:30.230
Kevin Dushney: Yeah, I'll just be.
1290
02:06:30.230 --> 02:06:32.049
Kevin Dushney: That'll just be a gem in the future
1291
02:06:35.690 --> 02:06:40.619
Kevin Dushney: yourself. Yep, this is so. This is so clever, Mike.
1292
02:06:40.620 --> 02:06:41.950
Mike Crispin: It's very clever.
1293
02:06:41.950 --> 02:06:45.230
Nathan McBride: It hurts so.
1294
02:06:45.230 --> 02:06:45.640
Mike Crispin: Clever.
1295
02:06:45.640 --> 02:06:46.240
Nathan McBride: Sure.
1296
02:06:48.780 --> 02:06:52.539
Nathan McBride: Can you use AI to make toasts? Not yet.
1297
02:06:53.930 --> 02:06:56.370
Nathan McBride: Today was administrative professionals. Day.
1298
02:06:56.370 --> 02:06:57.020
Mike Crispin: Can you.
1299
02:06:57.020 --> 02:07:05.610
Nathan McBride: Shout out to all those admin professionals, you you absolutely are not replaceable by AI. You kill it every day because you listen to our bullshit.
1300
02:07:05.720 --> 02:07:08.720
Nathan McBride: and you solve our little petty problems.
1301
02:07:09.450 --> 02:07:15.048
Nathan McBride: and somehow you always make sure the printer has paper. So God bless you all!
1302
02:07:17.620 --> 02:07:22.289
Nathan McBride: I hope you guys both got wonderful gifts for all the eas in your company today
1303
02:07:23.580 --> 02:07:24.789
Nathan McBride: they are the engine.
1304
02:07:25.570 --> 02:07:26.950
Mike Crispin: Yes, yes, you did.
1305
02:07:27.300 --> 02:07:31.270
Nathan McBride: What'd you get, Mike? I know, because I know Kate did some shopping for them. But what did you get.
1306
02:07:31.530 --> 02:07:37.817
Mike Crispin: We we still have something to give tomorrow, because they were one of them was out of the office. But we have the
1307
02:07:38.410 --> 02:07:44.820
Mike Crispin: the salty snack basket with some liquor and chocolates and other things.
1308
02:07:44.820 --> 02:07:46.160
Nathan McBride: Oh, yeah.
1309
02:07:46.160 --> 02:07:51.610
Mike Crispin: And then we have a AI generated picture of us. Walking in
1310
02:07:52.304 --> 02:07:56.700
Mike Crispin: handing over the basket, and me and Berto have the my ties, and we're.
1311
02:07:57.560 --> 02:07:58.090
Kevin Dushney: That's awesome.
1312
02:07:58.090 --> 02:08:04.720
Mike Crispin: And it's like a spooky cartoonified. It's so so awful!
1313
02:08:05.260 --> 02:08:10.890
Mike Crispin: That's terrible picture. I'll send it to you guys. It's terrible on slack.
1314
02:08:11.963 --> 02:08:17.336
Mike Crispin: It's so funny. But they look at like, Oh, yeah, this is great. Thanks.
1315
02:08:17.720 --> 02:08:19.240
Nathan McBride: Ha! Ha! Ha! Ha! Ha!
1316
02:08:20.170 --> 02:08:21.000
Kevin Dushney: Fringe.
1317
02:08:21.450 --> 02:08:23.510
Mike Crispin: Oh, that was fantastic!
1318
02:08:23.510 --> 02:08:34.119
Nathan McBride: And thank you to all those hardworking folks who are literally handling children, literally grown bearded children
1319
02:08:34.120 --> 02:08:35.579
Nathan McBride: make the company around.
1320
02:08:35.700 --> 02:08:41.929
Nathan McBride: They they make them get to their their flights, and that there's lunch that shows up on time.
1321
02:08:41.930 --> 02:08:43.380
Kevin Dushney: Meetings, endless meetings.
1322
02:08:43.380 --> 02:08:47.060
Nathan McBride: Meetings, and they make the zoom room look presentable, and
1323
02:08:48.160 --> 02:09:00.470
Nathan McBride: oh, my God, I couldn't do it! So thank you to all them, and on your behalf. We're going to tackle all these wonderful topics next week to close out Episode 9, so that we can eventually get to Episode 10
1324
02:09:03.080 --> 02:09:14.670
Nathan McBride: next week we'll be coming to you. Live actually after the box. AI digital sovereignty transformation.
1325
02:09:15.640 --> 02:09:20.620
Nathan McBride: AI Api summit in Boston.
1326
02:09:21.210 --> 02:09:25.120
Nathan McBride: Maybe we should do a remote session on next Wednesday, since we'll all be together.
1327
02:09:25.880 --> 02:09:28.650
Mike Crispin: Yep, that we can do.
1328
02:09:28.650 --> 02:09:32.669
Nathan McBride: Mike's not listening to what I said. He's just agreeing, so we'll talk about.
1329
02:09:32.670 --> 02:09:33.709
Mike Crispin: Let's do it.
1330
02:09:35.920 --> 02:09:37.180
Nathan McBride: What do what Mike?
1331
02:09:37.630 --> 02:09:38.960
Mike Crispin: Remote session, live.
1332
02:09:39.220 --> 02:09:39.810
Nathan McBride: Okay.
1333
02:09:40.190 --> 02:09:40.570
Mike Crispin: Yeah, yeah.
1334
02:09:40.800 --> 02:09:41.550
Kevin Dushney: That's for sure.
1335
02:09:41.990 --> 02:09:43.530
Kevin Dushney: At the Box Conference.
1336
02:09:44.120 --> 02:09:49.120
Nathan McBride: At the box. AI, digital, a AI, digital copies.
1337
02:09:49.770 --> 02:09:50.470
Kevin Dushney: Excellent.
1338
02:09:51.370 --> 02:09:52.030
Nathan McBride: Do it.
1339
02:09:52.030 --> 02:09:53.299
Mike Crispin: Live, we'll do it. Live!
1340
02:09:53.300 --> 02:09:54.060
Kevin Dushney: We'll do it. Live!
1341
02:09:54.060 --> 02:10:06.580
Nathan McBride: We'll do a live from the sports locker bar at the Omni Hotel in the sports locker room.
1342
02:10:07.010 --> 02:10:07.770
Mike Crispin: Nice.
1343
02:10:07.770 --> 02:10:10.880
Kevin Dushney: Is that where the is that where the session is? Is it a domino.
1344
02:10:10.880 --> 02:10:12.499
Nathan McBride: That that can be.
1345
02:10:13.120 --> 02:10:16.180
Kevin Dushney: Oh, oh, convenient! That's right. You said that. Okay.
1346
02:10:16.350 --> 02:10:17.430
Nathan McBride: I don't know where that is.
1347
02:10:17.470 --> 02:10:21.619
Kevin Dushney: Send me send the invite if you haven't. I haven't. I didn't. I didn't see it.
1348
02:10:21.620 --> 02:10:23.619
Nathan McBride: I sent both you on Gmail.
1349
02:10:23.860 --> 02:10:24.580
Kevin Dushney: Alright, cool.
1350
02:10:25.230 --> 02:10:30.709
Nathan McBride: Got it. So 1 1 Boston place, whatever the hell that is, whatever the most generic address.
1351
02:10:31.470 --> 02:10:32.110
Kevin Dushney: Yeah.
1352
02:10:33.820 --> 02:10:43.830
Nathan McBride: But let's do it. And then, if we will, I'll bring all the stuff. And we can just set up our thing and do a podcast live. And we'll just have, like immediate thoughts about
1353
02:10:44.060 --> 02:10:47.770
Nathan McBride: all of the digital transformation that Moderna is doing
1354
02:10:50.450 --> 02:10:56.290
Nathan McBride: though their revenue is still the same. And you know all that next week.
1355
02:10:56.290 --> 02:10:58.980
Kevin Dushney: Yeah, that's I mean, that's that's really the
1356
02:10:59.270 --> 02:11:04.330
Kevin Dushney: the highlight of the entertainment is you know, how about is alt me gonna troll the modern AI guy.
1357
02:11:04.330 --> 02:11:06.679
Nathan McBride: Yeah, I was thinking the same thing, Kevin.
1358
02:11:06.680 --> 02:11:11.099
Nathan McBride: Both all Nate and Nate are gonna be trolling every one of them
1359
02:11:11.100 --> 02:11:17.299
Nathan McBride: panel about risk. So it's going to be come to Jesus moment about risk you stand by, be there.
1360
02:11:17.690 --> 02:11:19.560
Nathan McBride: be there, be square.
1361
02:11:19.780 --> 02:11:24.580
Nathan McBride: be nice to old people, be nice to human beings have your pets spayed or neutered?
1362
02:11:26.040 --> 02:11:31.850
Nathan McBride: Don't be a dick, especially to it, people. They're trying their hardest
1363
02:11:32.600 --> 02:11:38.529
Nathan McBride: when the help desk Guy shows up at your desk, or or or person shows up at your desk.
1364
02:11:38.790 --> 02:11:45.070
Nathan McBride: and they're just like rebooted not talking either asshole. You actually need to reboot your fucking computer.
1365
02:11:45.380 --> 02:11:47.610
Nathan McBride: So be nice to them.
1366
02:11:48.330 --> 02:11:51.729
Nathan McBride: And then if it reboots it still doesn't work. Well, let them solve the problem.
1367
02:11:52.498 --> 02:11:56.480
Nathan McBride: Give us all the stars, buys a beer.
1368
02:11:56.950 --> 02:11:58.540
Nathan McBride: Go on our slack board.
1369
02:11:58.850 --> 02:11:59.779
Kevin Dushney: Hi! The merch!
1370
02:12:00.730 --> 02:12:14.519
Nathan McBride: Buy the Merch support. Wikimedia, support the Aclu, support life, science cares, support your local favorite charities like wounded warriors, or whomever else you like and be good to each other.
1371
02:12:14.760 --> 02:12:19.010
Nathan McBride: Don't be such a rush to get to work in the morning. By the way, it'll fucking. Be there when you get there.
1372
02:12:19.510 --> 02:12:24.710
Nathan McBride: Just relax right working, going.
1373
02:12:24.710 --> 02:12:25.420
Kevin Dushney: Love, that.
1374
02:12:25.810 --> 02:12:27.459
Mike Crispin: Take your time. There's no rush.
1375
02:12:27.760 --> 02:12:42.429
Nathan McBride: It's work the side of the road. Have a little 4 20 moment. Whatever. Just get to work work is, gonna be there on the way home I get it. You want to get home, get some pasta, some wine, speed up a little bit, whatever getting to work.
1376
02:12:42.430 --> 02:12:44.499
Kevin Dushney: Time for a podcast etcetera.
1377
02:12:45.065 --> 02:12:46.760
Nathan McBride: Don't, don't rush.
1378
02:12:47.530 --> 02:12:48.660
Nathan McBride: You guys do work.
1379
02:12:49.170 --> 02:12:51.780
Nathan McBride: No one fucking loves their job. That much. Stop relax.
1380
02:12:53.770 --> 02:12:58.880
Nathan McBride: You see a white Mercedes on the side of the road. That's me. Having a 4, 20 moment. Pull over. We'll join together.
1381
02:12:59.140 --> 02:13:02.220
Mike Crispin: Nice, if that's who that was.
1382
02:13:03.100 --> 02:13:07.809
Nathan McBride: That's that's that's who it was. That's who it is. Mike Kevin. Great to have you both back
1383
02:13:07.930 --> 02:13:09.739
Nathan McBride: or coming back. Mike.
1384
02:13:10.138 --> 02:13:11.730
Kevin Dushney: As always like always.
1385
02:13:12.090 --> 02:13:14.829
Mike Crispin: Next week. Will you guys finish this mother effort.
1386
02:13:14.990 --> 02:13:18.909
Nathan McBride: And move on with our lives. Alright.
1387
02:13:18.910 --> 02:13:20.289
Mike Crispin: So I'll sounds like a plan.
1388
02:13:20.290 --> 02:13:20.920
Mike Crispin: Love.
1389
02:13:21.360 --> 02:13:24.140
Nathan McBride: All the love, all right, peace and love, peace and love
1390
02:13:24.140 --> 02:13:30.450
Nathan McBride: you both. On Wednesday, at the box AI digital AI summit.
1391
02:13:30.940 --> 02:13:32.090
Kevin Dushney: Sounds like a plan.
1392
02:13:32.090 --> 02:13:33.100
Mike Crispin: Let's do it.
1393
02:13:33.450 --> 02:13:34.040
Nathan McBride: Peace.
1394
02:13:34.434 --> 02:13:35.620
Mike Crispin: Peace out, guys.
1395
02:13:35.620 --> 02:13:36.280
Kevin Dushney: Better.