The Calculus of IT

Calculus of IT - Season 3 Episode 4 - The Verification Economy (Part 2 of 2)

Nathan McBride & Michael Crispin Season 3 Episode 4

In part two, we tackle who actually does all this verifying, what it costs, and the uncomfortable questions nobody's asking yet.

IT is becoming the verification department whether you signed up for it or not. You're verifying employee identities, contractor credentials, AI agent authority, data provenance...basically everything, constantly.

We explore the shift from proactive governance to reactive verification, the impossible tension between privacy and verification, and scenarios that keep you up at night: locked-out executives, disputed transactions, and verification systems that might be wrong.

Big questions we couldn't fully answer: How do you verify AI agents? What gets verified - person, device, behavior, or all three? Can verification be continuous without becoming surveillance?

Mike predicts 2-3 years before verification pressure really hits IT departments. New job titles emerging: Chief Truth Officer, Model Auditor, Verification Vigilante.

We also accidentally invented a post-apocalyptic western about the last verification expert alive. It involves tape backup accidents and Mandalorian trust buoys. Don't ask.

Next week: Who Speaks for IT Anymore?

Support the show

The Calculus of IT website - https://www.thecoit.us
"The IT Autonomy Paradox" Book - https://www.longwalk.consulting/library
"The New IT Leader's Survival Guide" Book - https://www.longwalk.consulting/library
"The Calculus of IT" Book - https://www.longwalk.consulting/library
The COIT Merchandise Store - https://thecoit.myspreadshop.com
Donate to Wikimedia - https://donate.wikimedia.org/wiki/Ways_to_Give
Buy us a Beer!! - https://www.buymeacoffee.com/thecalculusofit
Slack - Invite Link
Email - nate@thecoit.us
Email - mike@thecoit.us

Episode 4 - Part 2 - Final
===

Trance Bot: [00:00:00] The calculus of it, season three, verifying this identity. Sometimes you just have to take it. Sometimes you just have to take it, because it's season three. Divided autonomy, verifying identity, the calculus of it. [00:01:00]

Mike Crispin: I would just say I'm not sure what the, uh, what the issue is. I'm on an iPad with some AirPods on a one-gig connection. I don't know what it could be.

Nate McBride: Do you think it's my Windows 97 drivers? I don't know. On my, on my WinGate machine here.

Mike Crispin: It could be your Sound Blaster 95.

Nate McBride: My Sound Blaster. You know what? I just can't download the drivers anymore. I'm still using my little mini plug, so

Mike Crispin: I could reboot my iPad if you want. You want me to reboot this thing? [00:02:00] Will that help, you think?

Nate McBride: You don't have to reboot. Can you hear me okay? I can hear you just fine.

Mike Crispin: Uh, are you sure?

I wanna make sure that I'm, uh, coming through loud and clear for you.

Nate McBride: I'll just Auto-Tune you when I do it in post-production.

Mike Crispin: How are you doing, man? You doing okay? Yeah, I'm doing great. How about you? I'm great. Yeah. No complaints. Things are good. Things are good. 

Nate McBride: End of the year, only a couple more days left of work. 

Mike Crispin: Yes. Yes, 

Nate McBride: Absolutely. We had our, um, year end... well, we do an IT feedback survey twice a year, in the spring and then again in November to December.

And, um, it's always the same questions. I like the survey, and we usually get a pretty good response rate [00:03:00] 'cause we give away free stuff. Anyway, it's anonymous, and there's an open feedback section. And this year somebody wrote a question: where do you see AI evolving in the near future,

and how do we navigate AI in our day-to-day work lives? Yeah. I was like, that's a really good question, and I never really answered it. So I'm hosting a town hall tomorrow, uh, from 12 to 1 to answer that question. And as I was putting together this deck, I was thinking about our company, and this is probably true for most companies right now.

Yeah. Biotech, you could literally not use AI at all and probably get through the next two years of your company's existence with your current milestones and strategies without any problem. True. A hundred percent. You could just not even turn it on and still do just fine. Everyone's already planned for that.

No big deal. It's what happens in 2028 or 2029, [00:04:00] where I think everyone immediately pivots to, like, holy-shit mode. Um, I'm trying to think, how much time does a biotech company really, legitimately have before... um, why does your screen keep shaking?

Mike Crispin: Uh, closing a bunch of windows? 

Nate McBride: So how much time does a company that hasn't adopted gen AI yet legitimately have?

Yeah, in some way. In any way whatsoever. I think there's now, like, a clock on this

Mike Crispin: until

Nate McBride: the world blows up. Yes, that too.

Mike Crispin: Oh,

Nate McBride: um, that too. So anyway, I'm pretty excited to give this town hall 'cause it's a good way to answer the question people have: will my job be there tomorrow? And of course I'm gonna tell everybody that no, none of their jobs will be there tomorrow,

so just give up. No, I'm just kidding.

Mike Crispin: No. [00:05:00] See, I have this like, you know, me being the eternal optimist, I think the expectation will be that everyone does 10 times more. 

Nate McBride: Yeah. Your optimism makes me wanna vomit in my throat sometimes, and then choke on it.

Mike Crispin: You might not have a job if you can't do 10 times more.

That might be what the problem is.

Nate McBride: Yeah. Or if you can outsource your own job to a generative AI that's part of your own consulting firm that you then pay... yeah, tax benefits... and that just becomes a recursive loop. Then your gen AI agent outsources their work to another gen AI agent that they make so they don't have to work.

It's like inception. Yeah. It's inception. All right. 

Mike Crispin: It's layer upon layer of, yeah. 

Nate McBride: We'll see, feeling incepted. Um, so I was gonna ask you just now how Christmas was, but we're still a fucking [00:06:00] week away, so, so don't answer that question just yet, because I won't, I won't. We can, we can talk about last year's Christmas, but, um, I, I have no concept of time.

I was also gonna ask you how Thanksgiving was, but we already talked about that last episode. So we did, uh, this is, this is what happens when I take a week off of the podcast. I don't know where I am. 

Mike Crispin: We had, it was kind of a, a lull, right? So there wasn't much what was going on. Did you go to any big dinners or anything?

Like have any 

Nate McBride: Yeah, a bunch. We had the OG dinner 

Mike Crispin: Oh, cool. For 

Nate McBride: everybody. Um, not the OG dinner we used to have for TKT, but the other OG dinner. And then I had another dinner. Yep. I can't remember what that was for. And then I had another dinner on Thursday night at a Oma Oma, uh, restaurant in Lexington.

That was awesome. [00:07:00] Where is that? Which place was that? Uh, you said the name I remember. But it was, it was awesome. It was a single, it was a six person seating. Um, yeah. Uh, it's right in, right in sort of Lexington Downtown Center. Anyway, it was really great. Oh, okay. And then excellent. Then, and then another dinner, uh, yesterday.

Wow. Lots of dinners. Lots of dinners. I probably need to go on a diet 'cause uh, this, it's just that, that's time of year, man. December. Everyone wants to hang out with me. So what am I gonna do? Are you having a team dinner? 

Mike Crispin: Uh, we'll do like a team lunch or something. We're gonna, we're gonna go out with, uh, with the CFO next, after the first week of the new year.

So we were supposed to go last week. Yeah. And uh, and I was home with the sick kids, so we had to move until after the 

Nate McBride: Are you gonna get a team bucket of fries and share it? 

Mike Crispin: Yeah. Yeah. I'll probably get some of those little [00:08:00] hotdog pastries wrapped up. You know, those little hotdogs wrapped in pastry, yeah.

And then, um, maybe a couple bags of potato chips and um, you know, a few diet Cokes. 

Nate McBride: I got your Christmas card, by the way. Oh yeah. You liked that. Um, did you use Claude to make this or Jen? Like who are these people? 

Mike Crispin: I did not make that. 

Nate McBride: Did you have AI make them?

Mike Crispin: I did not make that. It's all real. There's no AI at all.

Really? No AI? Your son's got nine fingers. No, those really are real fingers there. That's real life.

That's cool. Now, don't you feel bad? No. I use 10 fingers.

Nate McBride: We're sending out, uh, cards from the Human Fund, uh, this Christmas. Yeah. Donate to the Human Fund on our behalf. That's right, that's right. Uh, I don't know. I don't do any of that stuff. That's, uh, all out of my [00:09:00] jurisdiction.

Mike Crispin: For me, it was the same thing. I took the picture of us on a bench in Nantucket, that's what that picture of us is, and that was my contribution. And then everything else was... Good job, man. I don't know. Yeah, I took that picture. Actually, I didn't take that picture; I handed the camera to someone to take it. I asked them.

Nate McBride: I really like the Unabomber cap and glasses. It's really, um,

Mike Crispin: Yeah. I tried to put a little ominous edge on the Christmas card this year. Ah, nice. You know, just a little bit of happy-holidays-you-never-know-what-comes-next type thing.

Nate McBride: Next year you should just send out an upside-down pentagram.

That'd be pretty cool. I mean, that'd be edgy, right? Yeah. Be edgy. I mean, 

Mike Crispin: you know, I just gonna dress up as Willy Wonka in the next one. Actually, I dress up as Grandpa Joe, where are my presents? Where are my, can I go too? [00:10:00] What about me? 

Nate McBride: Uh, 

Mike Crispin: yeah. 

Nate McBride: Well, Mike, oh, well, grandpa Joe. 

Mike Crispin: Grandpa Joe, 

Nate McBride: I love him. This is Cousin Itt, and I'm with Grandpa Joe, and we're bringing you, live from the upside-down pentagram, the Calculus of IT.

Uh, season three, episode four, part two. Yep. Love it. Got that. All right, I had to write it down so I'd remember where we were. We only have 44 episodes to go, Mike, so we gotta get going. Great.

Mike Crispin: Let's keep going. Let's keep moving. Are we taking a month off after this one? What's next?

Nate McBride: Next week is, like, Christmas or something.

Mike Crispin: Next week we're off for Christmas, but then we're back kind of right after that.

Nate McBride: You wanna, do you wanna do like a late night Christmas Eve podcast special 

Mike Crispin: Pull an all-nighter, like do a [00:11:00] marathon podcast. Go all through Christmas Eve. Yes. Broadcast live.

Nate McBride: We should play that shitty Paul McCartney song just on repeat.

And that'd be great. Somebody finally got the good sense to label that the worst song ever made, ever, and also the worst Christmas song ever made, ever. The radio stations are pulling it now; you don't hear it anymore, because now it's, uh, it's like the scarlet letter. Uh, I don't know how he got away with that, to be quite honest.

I mean, besides the fact that he probably got help from Phil Collins 

to get it through. 

But that is the biggest piece-of-dog-shit song ever; when it comes on, we can't hit the button fast enough. Um, anyway, so yeah, this is the Calculus of IT, uh, podcast. You can find us online at thecoit.us, and [00:12:00] we have a Substack where we keep all of our stuff.

Yeah. And that's thecalculusofit.substack.com. Uh, so I wanna do a quick recap of last week, or no, sorry, two weeks ago.

Yeah. 

And, um, Claude was so generous to help me with this because I went through the transcript and I really didn't remember. I was like, yeah, this is what, this is what I wanted to talk about.

And I was like, what the hell did we talk about? I have notes over here on this piece of paper, and this is what my notes say. Ready? I'll give you my note, it's one note. One note, and it says: fuck privacy. I'm not kidding. Really? Fuck privacy. Yeah, right there. Oh, beautiful.

That's beautiful. And then underneath that I wrote "episode," question [00:13:00] mark. So, I don't know, we must have said something about that. And then I also wrote "You Can Never Change Your Signature." I don't know why I wrote that. So that's all I have. Oh, I also have "Dream Stream." Dream Stream and Dream States. Yeah.

Chief Truth Officer. Those are my entire notes for, um, last week. Usually I'm better at it than that.

Mike Crispin: Well, those are insightful notes. I mean, that's kinda what we talked about, right? For the most part.

Nate McBride: Well, I wanna... let's get back into the dream. So anyway, I used Claude to synthesize our, um, transcript.

Yeah. I ran it through Claude; I asked Claude to transcribe my transcript, and it did that for me. And so here we go. According to Claude, here's the core thesis, okay? We're entering an era where proof of X becomes the fundamental currency of digital interactions. Not Bitcoin, not tokens, but proof itself. [00:14:00]

That's right. And IT leaders are the ones who make it work. Wow, that's pretty deep. And then we talked about why verification matters now more than ever. Yep. Um, because of five converging forces. Uh, did you say they're converging? I may have; I'd have to go back to the transcript. Anyway, we have, uh, deepfakes.

We have AI-generated content flooding in. We've seen the reports: Google estimates 50% of web content will be AI-generated by 2026. We have social-engineering bot networks and trust erosion; uh, 60% of LinkedIn accounts are now fake. So we've moved from trust-by-default to assume-it's-fake-until-proven-otherwise, which I actually kind of like.

Uh, so, the five pillars of verification. I dunno why I came up with pillars, but one was identity verification, which is to [00:15:00] say: are you who you say you are? And for identity, the traditional methods are breaking: passwords are dying, emails are compromised, IDs are deep-faked, and biometrics are spoofed.

This is pretty dire. Oh, and we talked about: a new remote employee starts Monday, how do you verify them? An employee calls the help desk to reset MFA, how do you know it's really them? And then a contractor with sensitive data access: how often do you reverify? I feel like we should come back to that.

This is a good one, but 

Mike Crispin: I don't wanna say, I think, uh, on the, on the, the high, like the, the bigger picture in terms of what the, the, the verification roles will, will be and the outside even of it. Is that, we'll, it'll be, I think right now there's a huge risk, risk aversion and for us to protect and to be proactive.

I think it'll be very difficult to be proactive. So the verification will all be reactive. It'll [00:16:00] be the reverse of what we do in so many ways now. Yeah. Is that we'll be, it'll be more about the best way to ask for forgiveness than to proactively protect and just, I mean that's kind of, in some respects, I think where it's going is, and that's why those proofs will need to stand up and will be at a premium, um, distant kind of the, the trust premium or the proofs premium.

I like that. Its like a supermarket or something. I feel, 

Nate McBride: I feel like I should have just written all that down. That was pretty deep shit.

Mike Crispin: That's okay, we've got a transcript. That's something to reflect on later.

Mike Crispin: And the question out of all that is: if you're reactive all the time, if you're reacting and responding, is governance whittling away? 'Cause governance is asking for permission and verification is asking for forgiveness, [00:17:00] right? Right. Yes. So it turns the whole thing on its head if you've got a thousand of these agents doing all these things and your company wants to move fast to keep up with the next guy. 'Cause I agree with you: I think you can run your whole business without AI, probably over the next year or two.

Nate McBride: Yeah. 

Mike Crispin: It's just that once you get to that point, the whole way that we operate and run will be mostly based on verification and response, as opposed to protection. It's just like cybersecurity: the biggest pillar of NIST now is response, how we react, more so than how we protect. And I think that's gonna help with AI, because someone's gonna say, just do it. Just do it, right? That's what happens with all governance. Just get it done. Can't we just do it this time? Just this once, it's an emergency. Why can't we just do it? And people will just do it. And there'll have to be a mechanism by which we can respond, and we can verify, and we can ask for forgiveness. Um, keeping up with the Joneses, if you will. Then who does it the best? Well,

and there'll be tons of jobs for that, I think, and attractive jobs, 'cause you'll have to be creative with how you prove things and how you describe and build relevance. It's gonna be interesting, but those jobs are yet to be created, and, uh, it's gonna be an interesting time, I think,

as this transition happens and we see how much we really trust these agents and these tools, if we really do or not. Time will tell, because right now we're in this disillusionment phase where it's like, oh, I don't know, we should probably slow down a little bit here. That's where we are right now.

Nate McBride: Trusting agents means admitting something right now. Yeah. Uh, the next thing we talked about was humanity verification, which is the are-you-even-human [00:19:00] question. So, uh, CAPTCHA is dying. Well, it's been dying for years, but AI now solves CAPTCHAs better than humans do.

The paradox we talked about was: we need to verify humanity for some things, but we want AI agents acting autonomously for others. The uncomfortable future is that employees may start using AI to attend meetings, respond to emails, and complete tasks. And so, are you employing the human or are you employing the AI?

Um, geez. Amazing. Alright, then we talked about authority verification, which is: are you authorized to do this? We talked about this last season. Most companies verify identity constantly but verify authority almost never after the initial access has been granted. Authority is contextual; it changes frequently and can be delegated, including to AI.

And [00:20:00] then we talked about how we verify who you are, but rarely verify whether you should actually be doing what you're doing at some future point. Which is,

Mike Crispin: Yeah. I think that has been the case for a long time.

At least from an IT perspective: you gain entry to the garden, if you will, and what you do in the garden is up to a bunch of other people. Yep. Once you're in, you're in. If someone owns a piece of data and shares it with a person, we've let them in the door.

But we relinquish a lot of control once people are in. And that's why internal penetration tests in cybersecurity are very interesting at times. We've always let people in the door, and once they're in, when you're doing social engineering and internal pen testing, you start to see: oh, shoot.

Maybe we do need more oversight on what we do [00:21:00] inside in terms of granting access and attributes, read/write permissions, those types of things.

Nate McBride: I still come back to the question of why we give everyone the ability to create. It's a great question. If we stopped doing that... oh my God, if we stopped giving everyone the ability to create shit,

it's a whole different ball game. I don't know what ball game it is, but it's a different ball game. Uh, we talked about provenance verification, the did-you-actually-create-this question, and chain of custody in that area. Like, what types of content will you actually care whether a human did or did not generate?

Because there'll be content where you don't give a shit who did it, and content where you very much care. And the questions are, A, why will you care, if you care, and B, what do you [00:22:00] do if something was generated through the wrong chain? And then the question becomes: if someone uses AI for 90% of the deliverables they make and adds only 10% of human touch or QC, did they create it?

And do you care, if it's good? That whole provenance question, just like everything else, could be its own fucking episode. Because if we do get to a point where people become very good and efficient at generating, they become these sort of QC endpoints, right,

for all the data? Those are the verifiers. Yeah, the verifiers, right? The verifying role. Yep. Then how do you determine what's good or what's not? That's a big question. And then, um,
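The chain-of-custody idea has a simple mechanical core: record a signed digest of a deliverable at creation time, so "did you actually create this" becomes checkable later. A minimal sketch, assuming a signing key held by the verifying system; the record format and function names here are invented for illustration, not any real product's API:

```python
import hashlib
import hmac
import json

def provenance_record(content: bytes, author: str, signing_key: bytes) -> dict:
    """Produce a signed provenance record for a piece of content."""
    digest = hashlib.sha256(content).hexdigest()
    payload = json.dumps({"author": author, "sha256": digest}, sort_keys=True)
    sig = hmac.new(signing_key, payload.encode(), hashlib.sha256).hexdigest()
    return {"author": author, "sha256": digest, "sig": sig}

def verify_record(content: bytes, record: dict, signing_key: bytes) -> bool:
    """Check both that the content is unchanged and that the record is authentic."""
    if hashlib.sha256(content).hexdigest() != record["sha256"]:
        return False  # the content changed since the record was made
    payload = json.dumps(
        {"author": record["author"], "sha256": record["sha256"]}, sort_keys=True
    )
    expected = hmac.new(signing_key, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sig"])
```

The point of the sketch is the split Nate describes: the digest answers "is this the thing," while the signature answers "does the claimed chain of custody hold up," and either failing is a different kind of dispute.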

Mike Crispin: still need the human in the loop. You still need the human in the loop and the expertise in the loop. That's still human.

I think we'll still need that for some time as long as we, we don't [00:23:00] fully trust these models or these tools. Um, 

Nate McBride: Yes. And the last part we talked about was expertise verification, which is: do you actually know what the fuck you're talking about? I mean, we all know about explainability,

and now, if you're good at it, you can use prompt chaining to show the path of thought. Um, we used the plumber metaphor, remember that? If the plumber comes in and solves the problem by standing at your sink, Googling it, watching a YouTube video, and using their wrench, are they still a plumber? Are they still the best plumber, or are they the best plumber because they knew where to find the answer?

And it's no different than asking: is that the best IT person, or are they the best IT person because they know how to Google the answer? Remember talking about that. And then the question was, could you basically hire any person off the [00:24:00] street, anyone, if they're good at asking prompts, to do any role?

Um, that one was a good conversation. I remember we talked about it at length, and it's still obviously well worth exploring if we wanted to. But then we broke down the verifiers into five categories. We talked about governments, with digital IDs and passports, which are now becoming very, very complicated;

platforms, which are mostly broken; third-party services like, uh, ID.me and Clear; uh, World...

Mike Crispin: Coin.

Nate McBride: World... and Worldcoin, yes. We finished up with Sam Altman's Benoit Balls, uh, talked about cryptographic methods, and then IT departments, which are now going back to becoming the first line of defense, oddly enough.

But then it brings us back to the very first question: if you've never met the person before and they're [00:25:00] starting remote, how do you know it's them? So we talked about trade-offs, like more verification equals more data collection equals less privacy. The surveillance creep starts with "prove you're human" and ends with "verify everything you do." And can you have privacy and verification?

We determined it's expensive and complex. Then, accessibility: not everyone's got a government ID, a smartphone, and biometrics to scan, and the inability to verify will become an inability to participate in society. That's a fucking big one. And then, who gets left behind? Homeless, elderly, disabled, low-income people, people from countries without digital ID systems. And so, within a very short amount of time,

perhaps the next five years, you could see literally a split right down the middle. Well, not in the middle; it'll be like maybe one tenth is in and nine tenths are out. It's Industry 5.0, [00:26:00] uh, the Ray Wang version. And then we get to performance friction: we talked about low friction equals happy users but high risk, versus high friction equals secure but frustrated users.

And then, uh, generally the data shows that 87% of carts are abandoned if a secondary verification is required, which is why Stripe does so well: Stripe immediately detects you; it doesn't make you do a second login to verify yourself. Um, side note, by the way: I was trying to find data about CVV numbers, like how many people remember them.

A staggering number of people don't know the CVV of their primary credit card. Most people know some or all of the actual credit card number, and some know the date, but few know the CVV, because ever since Stripe started storing it for [00:27:00] you, they forgot it.

You remember my whole Amex situation. I knew every single digit of everything, every time I had a new card, and I was able to add two cards to that memory. And now I can't remember my CVV numbers; I gotta look 'em up all the fucking time. Um, the economic costs: verification services run $1 to $5 per verification for ID.me-type services; background checks are $30 to $100 per person; biometrics can be $10,000 to half a million. Plus staff time, infrastructure, compliance costs. And most IT budgets have no line item for verification; we have a cybersecurity budget.

Staff time, infrastructure time, compliance costs, and most it budgets have no line item in their budget For verification, we have a cybersecurity budget. A budget usually revolves around infrastructure. It revolves around platforms, but we do not specifically call out verification. And would we ever, your big prediction from last episode was we're moving from a governance economy asking permission to a verification [00:28:00] economy, asking forgiveness, as you said, jobs will be created around retroactively, verifying and fixing things after they happen, which was, which was important enough for Claude to point that out.

We have new roles coming, the chief truth officer and the, the model auditor. Model auditor. Claude couldn't figure out what you were trying to say. Um, and yeah, so we're getting in. That was it. That's a summarization of episode. Well, episode four, part one, and now we're gonna freaking strap it on and dive into part two.

Um, and so I had a starting point here, which is: on the verification point, what happens when verification fails? Great question, right? I mean, obviously you know about the fucking inane [00:29:00] wackadoodle bill that's trying to be passed that would make people coming in on a visa show their last five years of social media, and how easily that can be spoofed.

Verification does fail, and for those people trying to come into this country it matters, because they could face serious problems if they can't verify everything. But when it comes to an employee, what happens if it's a legitimate employee but you just can't verify them? Yep. Or it's disputed.

So think about the edge cases for this. You have false accepts: an employee's compromised credentials are used, the system says verified, but it's actually an attacker. Who's liable, right? Then false rejects: a legitimate employee can't verify. Who's liable? And disputed verification: the employee says, I didn't do that, and the system says, yes you did, we verified it.

System says, yes you did. We verified it. Uh, we have this a lot actually happen [00:30:00] with box. Uh, box gives great file analytics. Um, you know, if you click on a file, you can see the analytics of that file and kind of drill down into who's downloaded it, viewed it, whatever. Uh, we had somebody, um, contact the help desk the other day saying that why is so-and-so looking at my file?

And I said, yep, well let me go look. And sure enough, this person had viewed the file like a month ago. One time. 

Mike Crispin: Yep. 

Nate McBride: I said, well, you gave that individual editor rights, right? Like I did. Yeah, look over here, it's editor rights. Oh. And then they got an invite to your document, and they could click the link.

So I go to that employee and I say, did you get an invite for this and click the link? They said no. So we looked through their email, and sure enough, there was an invite there, but there's no way to tell if they actually clicked it or not. I was like, okay, it's not a security incident. You clicked it, you just don't remember, 'cause you're doing a million other things.

But it was one of those cases where the system says you did, you say you didn't, [00:31:00] but evidentially, there it is. Um, so we have some people traveling in January to some countries that are potentially hazardous. And I was thinking about this too: you have somebody who's an executive traveling internationally, they need to approve some kind of decision, and the verification system says it's not you.

Mm-hmm. So these are all the sorts of things I was thinking about for verification disputes, and how problematic that could be. You would need a secondary channel of verification for when your initial path of verification failed, if you wanted to use verification as a true means of proof. Does that make sense?

Yes, it does. What are your thoughts on that? 

Mike Crispin: I think when verification fails... you just gave an example in Box, right? So when verification [00:32:00] fails, we have to figure out ways to, I dunno the best way to put it, kind of hold the line from an accountability perspective, right?

If something doesn't work out, either we stick by the verification model or we don't, and sure, I imagine there'll be exceptions to the rule. But when something can't be verified, from an identity perspective there are other ways to identify people, like we've talked about: proximity, or meeting them in person, or trusting another peer in the company to be able to identify the person.

Um, yeah, there are ways to. But data is a different piece, structured data, unstructured data: if it doesn't check out with [00:33:00] whatever proof has been built, then it's not good. It's just like if we can't successfully run a PQ, it fails the qualification.

We don't do it; it's kind of the same thing. We gotta fix it and test it again. So that could take time, but we gotta stick by that. We can't just kind of plow through it. And I think that's where these roles are gonna become more plentiful. Yeah.

Nate McBride: But doesn't that kind of back us into that original topic of privacy and verification? Like, okay, how much do I have to get from you to verify it's you?

Mike Crispin: See, we've gotta have a multiple-factor way of identifying people. You used the safe word example last week, right? If we've got more than one way to do that, I think that's [00:34:00] how it will be done.

There'll probably be some more intricate tools, and there's a ton of software companies trying to fix the identity problem. Um, but yeah, I think it's—what's the real—

Nate McBride: what's the problem, by the way? 

Mike Crispin: So I think it's simplicity. Like when someone calls into the service desk, having a number that they can check that only they know, or a number that's changing—

that only they can see, that's shared. So it's almost like a public key. Yeah. That's shared with the person on the other side. And there haven't been a lot of great ways to do it. I realize there are some good ways to do it now, software tools and whatnot. But someone would call in, and it was kind of like, okay, what's your employee ID?

Or, you know, send me a message on Slack—oh wait, maybe that's been compromised. If someone's been compromised, the electronic tools [00:35:00] may not really work. So it's almost like you gotta have a verifier that's out of the loop. And that's the challenge, I think: having something that is outside of the ecosystem and architecture, that is almost personally owned or personally identifiable, that is still—I don't mean publicly shared, I mean shared within a company—a public-key-type model, where it can be verified by the person on the other end.

Yep. That's the correct key. I'll let you in. And it's shared. So for each user in each transaction, there has to be one of those. And I think that creates complexity and confusion. Almost like when MFA emerged. I mean, remember how painful that was?
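The rolling, shared number Mike describes is essentially how TOTP works: both sides derive a short code from a shared secret and the current time, so a caller can read it out and the service desk can check it independently, without the secret ever crossing the wire. A minimal sketch using only Python's standard library (the first secret is an invented example; real deployments provision secrets out of band):

```python
import hashlib
import hmac
import struct
import time


def rolling_code(secret: bytes, interval: int = 30, digits: int = 6, now=None) -> str:
    """RFC 6238-style time-based code: both parties derive the same short
    number from a shared secret and the clock, without sending the secret."""
    counter = int((time.time() if now is None else now) // interval)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)


# Both sides hold the secret; a help-desk agent and a caller would each
# compute the code and compare verbally.
shared_secret = b"provisioned-out-of-band"  # assumption: illustrative secret
print(rolling_code(shared_secret))

# Sanity check against the RFC 6238 test vector (SHA-1, 8 digits, T=59):
print(rolling_code(b"12345678901234567890", digits=8, now=59))  # 94287082
```

The last line reproduces the published RFC 6238 test vector, which is a reasonable way to convince yourself a homegrown sketch like this matches what real authenticator apps compute.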

Nate McBride: But at the time, yeah, MFA was weird.

I think more so because of the order of events for us, or [00:36:00] at least from a technology perspective as I recall it. The first 2FA token anybody would've had would've been an RSA token, circa 2005, 2006. Somewhere in that timeframe, RSA tokens became sort of de rigueur, but you had to go through a shit ton of hoops and hurdles to link one up to a platform internally, and you couldn't do it for all your platforms.

There was no SSO. Then SSO comes in, and it was internal SSO only, which required AD, and it was just an appliance. That was like circa 2008. And then SSO gets strong in 2009, or it emerges anyway, and the problem becomes: how do we get these two things to work together and also get rid of this damn token?

Yeah.

And it wasn't until maybe two or three years later, if I'm remembering right, 2012 or so, that you started to have MFA. Although it still wasn't called [00:37:00] MFA at the time; it was 2FA.

Mike Crispin: Yeah. 

Nate McBride: But it was 2FA with SSO, and then it became MFA. But at that time we had minimal data collection, so mm-hmm.

In order to do MFA, you really only needed an email address, a password, and then your token. Yep. If you get into verification, that's probably not gonna work. I mean, if you get into verification, yeah, you need everything that Mike's ever done to prove that it's Mike.

Mike Crispin: And verifying is hard, because you're telling people not to trust anyone.

Right. You know, with the data—it's a zero trust model. You tell 'em not to trust, and then you're saying, well, we're gonna need some information from you to verify yourself. And like today, if you wanna buy Celtics season tickets, for example, you need to [00:38:00] go to DocuSign, scan your ID, do a verification process to make sure it's you, the whole thing.

Yeah. I don't wanna give my ID to the Celtics.

Nate McBride: Right. Exactly. 

Mike Crispin: Even though it's not going to them. I know that's not going to them, but it's going to some system that checks the box and says, oh yeah, it's okay to let him in. We're telling people to be careful of what they share, and now there are verification services saying, join LifeLock, send all your shit to us and we'll protect you.

Yeah. So there are all these verification services to protect your identity, and you're gonna have to give something up, at least the way things work today. Which is why there's got to be some identity platform or sort of blockchain model out there in the future that identifies people and that is open.

Not some business. And I just think that's the next open source project, like [00:39:00] Linux was in the nineties. Someone's gonna build an open source identity framework that's not for profit, is for public benefit, and is just adopted and built globally as open source. And the closest thing we have to that, operating at that level, is Bitcoin.

Yeah. You know, and that's not identity. That's transactional. So, I mean, something like that, if someone could come up with that

Nate McBride: mechanism—well, Bitcoin isn't as anonymous anymore as it used to be. Yeah. Sure. But that—

Mike Crispin: But that ledger-like model—if there was a way to use that to hold the verification code that links you to the name. It's open, it's hard to tamper with.

Um, [00:40:00] it would be very interesting for someone to build that. It's just that there's not a pay-it-forward, GPL-like, community-based mindset on these things anymore. Everything is for profit.
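The ledger-like model Mike describes—an open record where the entry linking a verification code to a name is tamper-evident—can be illustrated with a toy hash chain. This is a sketch of the property he's after, not Bitcoin; the record fields are invented:

```python
import hashlib
import json


class Ledger:
    """Append-only chain: each entry's hash covers the previous entry's
    hash, so editing any earlier record breaks every hash after it."""

    def __init__(self):
        self.entries = []

    def append(self, record: dict) -> str:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps({"prev": prev, "record": record}, sort_keys=True)
        h = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append({"prev": prev, "record": record, "hash": h})
        return h

    def verify(self) -> bool:
        prev = "0" * 64
        for e in self.entries:
            payload = json.dumps({"prev": prev, "record": e["record"]}, sort_keys=True)
            if e["prev"] != prev or hashlib.sha256(payload.encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True


ledger = Ledger()
ledger.append({"name": "Mike Crispin", "verification_id": "vc-001"})
ledger.append({"name": "Nate McBride", "verification_id": "vc-002"})
print(ledger.verify())  # True

ledger.entries[0]["record"]["name"] = "Imposter"
print(ledger.verify())  # False: tampering broke the chain
```

A real public ledger adds distribution and consensus on top of this, which is what makes tampering hard in practice rather than merely detectable.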

Nate McBride: You're going in the direction I hope this goes, which is getting away from a surveillance state of verification.

There was a great article, I think it was on Blood in the Machine, about a guy who was just walking back from the gym and got thrown into an ICE van for an hour and a half 'cause he didn't have his driver's license. A totally legitimate, in every possible way, American citizen. They were taking pictures of his face and comparing them against the ICE database.

And finally they confirmed that he was not, you know, an illegal alien or whoever it is that they feel entitled to chase, and they let him [00:41:00] go. Just like that, threw him out of the van. Um, and the verification used to verify him is obviously, intentionally, based on a surveillance-type program. But where does the line get drawn between me verifying Mike and me having to do surveillance of Mike to prove Mike is Mike?

It's like that old question: hey, prove to me that you're 21 years old without revealing your birth date. Sure. Like, prove it. Well, it's fucking impossible. Um, not impossible, but it's pretty damn hard. Same principle. Like, when does verification just become inherently surveillance?

That's a big concern too, at least on the march towards [00:42:00] the verification economy. Yeah. And who gets to—well, this is kind of a rhetorical question, but who gets to not be surveilled versus who does? I mean, unless you're a white, rich male, I feel like you're gonna be surveilled in this country in the next few years, like permanently surveilled.

But, um, you know, it's just one of those questions. Further to that, or a subset of that, I guess: what's the minimum data needed for verification? Maybe that's the real question. What's the minimum for any situation? If an employee comes into a company, what's the minimum I need about them to verify that they are who they are?

Remember we talked about physically meeting the person last time? Yep. How that's easy, right? It sets up that initial trust. I know you, we've met, we've talked, I've spent enough time with you to understand your mannerisms, how you talk. Maybe that's a little bit too much. But if I'd never met you, what would be the minimum?

You know?

Mike Crispin: Yeah. [00:43:00] It's a difficult question. And in terms of me using meeting someone in person as a measure: between those two people, there's a verification that happens. You meet that person and you say, yeah, you are who you are. But that's a temporary transaction. It's not saved, it's not understood, it doesn't compound that information over time.

And if you do compound the information, there's a question about privacy. People don't want everyone to know that, hey, I have three friends—Nate, Billy, and Mike—and I met Billy three times last week and Nate eight times last week.

And Billy knows that Nate met that guy eight times last week. There are all these privacy-related things with building a model around that. [00:44:00] People may not want people to know that. So it's kind of like zero trust meets verification, and how is that all gonna work out? Because to really verify people—I know we joke about Worldcoin, but take the iris-scanning model. Who wants their eyeballs scanned?

Right? Who wants their face scanned? Who wants these types of things to happen? Some of these are great verifiers, but you're getting into, like you said, the risks of surveillance, the risks of it getting into the wrong hands and being used the wrong way. The accessibility for people to actually be able to do it safely, and to have the opportunity to do so.

It's almost a paradoxical thing right now. You're saying, hey, we want security, we want privacy, but in order to better verify truth and information—as you say in your notes—privacy [00:45:00] has to start to erode. And governance, to some extent, not by design, could potentially erode too.

And it really depends on how we work and how we handle information in the next decade. Is it gonna be go, go, go—I just want to get this done, I wanna do it faster and better, and I'm willing to take the risk? Or is it, look, we need to protect people's privacy, we need to make sure that data has integrity,

we need to make sure we're being safe with our information? So I think those two things collide, and it lands somewhere in the middle.

Nate McBride: But let's back up a minute. You're doing what I tend to do, which is conflate privacy and verification again. I mean, yeah, yeah, I am.

Yeah. So we need to find a [00:46:00] balance between Mike's privacy and Mike's ability to do things. That's right. And Mike has a choice to make. He can concede all of his privacy and have a hundred percent optionality, or he can concede none of it and can't even buy something. Yeah. Geez, you're right.

It is paradoxical. It's a paradox. And I mean

Yeah, because it comes right back to the privacy question. You can either do what Twitter does, which is just buy verification. It just costs you money and you're verified. But it doesn't

Mike Crispin: hold a lot of water either. Right. I know. Those verifications don't mean anything.

Nate McBride: It's meaningless. And then you can have super-duper crazy-ass [00:47:00] verification, like when healthcare.gov launched and nobody could get in because there were so many different steps you had to go through to enroll. And then you have that Goldilocks zone, like Amazon. I mean, I hate to give them credit for anything, 'cause they're, you know, the evil empire.

But it's one-click ordering. Yep. Everything about me is there. Now, obviously, the risk is that it's so easy to destroy someone's life by getting their account. Yeah.

Mike Crispin: I think, like we were just talking about with open source—and we talk about decentralization a lot—it may be that we really need to get to decentralized identity, and that needs to be something you carry.

Just like we talk about bringing a little AI assistance to work: you're bringing your identity to work, your digital identity. [00:48:00] Your personal digital identity can be leveraged by businesses. And that could be the middle ground, the middleware, if you will, that protects privacy and that you can bring in.

And who's gonna do it? Who's gonna do it well? Is it your Amazon account, 'cause everyone has one? Or is it one of these big vendors that's gonna build a model on top? And the other question, on the verification side, is: is there going to be data that's created that, once it's proofed and verified, can no longer be consumed and manipulated or edited by an AI model?

Let's just say, like, cryptographic provenance. Yeah. So the creation is protected by some sort of construct [00:49:00] once it's been verified. So after the fact: okay, that's a proof, we can't touch that anymore. We'll put that in the proof

Nate McBride: box. A proof of concept for this already exists with NFTs.

I mean, the idea that a thing can only exist one time in the world, ever. Even if copies are made, the original can only exist one time. That's a construct that was proven. It was made into a joke, but it was proven. So you're right, I think that's a great idea, this cryptographic provenance. I just, by the way, made a note that we're gonna have to have an episode on the quote-unquote mothership.
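The "proof box" idea in the exchange above—freeze a record once it's verified, so any later edit is detectable—can be sketched with a detached tag stored alongside the data. This uses an HMAC for brevity; real provenance schemes (and NFTs) use asymmetric signatures, and the key and document here are invented:

```python
import hashlib
import hmac


def seal(data: bytes, key: bytes) -> bytes:
    """Produce a tag at verification time; store it alongside the data."""
    return hmac.new(key, data, hashlib.sha256).digest()


def still_intact(data: bytes, key: bytes, tag: bytes) -> bool:
    """Anyone holding the key can check the data hasn't been touched since
    it was sealed (constant-time compare avoids timing leaks)."""
    return hmac.compare_digest(seal(data, key), tag)


key = b"provenance-signing-key"  # assumption: held by the verifying party
document = b"verified dataset, v1"
tag = seal(document, key)

print(still_intact(document, key, tag))                 # True
print(still_intact(document + b" (edited)", key, tag))  # False
```

Note this makes tampering *detectable*, not impossible: an AI model (or anyone else) can still alter the bytes, but the altered copy no longer matches the sealed proof.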

And just what you mentioned there: the ability to take your verification and identity and go from place to place. It eliminates the need for directories. Now, when Mike joins my company—that's right—he's Mike Crispin, he's fully verified, he's got this ID, I already know.

The provenance is perfect and everything. He just bolts [00:50:00] onto my mothership, and now he can access the things that Mike needs to access. Um, honestly, it's very—

it's kind of not too hard of a thing to realize technologically. I mean, already I can add someone from outside of my domain to a platform, and they can be part of my domain; I can bring them in, even with their own email address. So I've got the idea: okay, adding Mike's thecoit.us address to my Airtable domain, because I know him and I know that's his email address.

I can add him in even though he's not part of my company. Easy. So if Mike can get his thecoit.us address and just take it with him everywhere, he no longer needs a company domain. Now, for the sake of email continuity, if someone wants to contact Mike, [00:51:00] they need to be able to contact him at the mothership address, but it's just gonna bounce to his private identity, or his verified identity.

Mike Crispin: I would still keep those things separate. It's the identity piece that you bring to work with you, your sign-in on anything. Take, for example, Google, Facebook, Microsoft. Microsoft splits them in two: they have a personal Microsoft account, and they have the work-and-school Microsoft account.

But take Google, for example. You could log in—and I think this is again a privacy thing, right? If you take your Google account and use that to log into your work account: oh no, is my work account now linked to my personal stuff? And oh yeah, I can verify who you are,

'cause you'd never lose that account. That's your personal account; that must be you. But you're giving up potential privacy, and from a legal perspective, you probably [00:52:00] wouldn't wanna have those things mixed together. And the same thing happens with LinkedIn Learning. Even though it is truly separate, the mindset is: I'm not gonna share my LinkedIn account with my work,

right? Because if someone's contacting me for a job or something, I'm afraid work might see it. Even though LinkedIn goes to great lengths to try and separate those things. That's why LinkedIn Learning gets dropped at a lot of companies: you're asking employees to log in with LinkedIn to a company resource, and people go, uh, I don't know if I wanna do that.

So I think there's that. I think it has to be identity only. It's not a sexy thing. It really just verifies you as a person, that you are who you are. And then, sure, anything can be hacked, so, yeah, I get it. But that's why, if someone's [00:53:00] focusing on just that piece, they've gotta make it attractive enough as a product that someone's actually gonna do it, you know?

And it's okay, but it's not gonna be very fun to see: oh yeah, pay $5 a month for a verification ID. Like, no one's gonna do that.

Nate McBride: Alright, well then the question is—

Mike Crispin: It should be open source.

Nate McBride: It should be. But how do you get to answering the question of whether it can be continuous or point-in-time?

If I trust Mike once, is that it? Do I ever have to reverify him? And if there's some sort of cryptographic provenance, I can't necessarily do that, so the reverification has to start almost completely anew—unless you have another idea, 'cause I can't think of one. You know, you have the point-in-time verification:

Mike logs in, he's got trust for the session, and after either some elapsed time [00:54:00] or the next login or something, he's gotta log in again. Simple; it's been going on for 40 years.

Mike Crispin: But I think it's a rolling cryptographic key that's changing every day and lines up with an algorithm that's being used to match those up.

So just like MFA: something that rolls, that's continuously changing, that only the person who has access has access to. But the difference is that if a human needs to verify it, they need to be able to see the key as well. You can trust the system to do that today, but in the new world it might be: hey, we need a shared key.

We need a shared verifier. And that's—does the

Nate McBride: chief trust officer that would do that?

Mike Crispin: Yeah. 

Nate McBride: Yeah, exactly. [00:55:00] Um, on that thought: remember we talked about the 1%, and how most of your data is crap and only 1% really matters? If you truly did your security properly, you would do really, really hardened security on the 1% of data you care about, and just decent, good security on the rest.

This was last season; we talked about this. And I bring that point up again because: what if you could separate out verification—I don't wanna say frequency—reverification needs, based on the activity? So Mike has been verified. He needs to access X, and if he does X, he needs to reverify himself every 30 minutes.

If [00:56:00] Mike is doing Y, he needs to reverify himself every six months. And there has to be a delineation there. Maybe it's not as simple as having some sort of structured ontology of the data, or maybe it is—at least, the technology for that isn't there yet. But it could probably be done at a platform level.

Yeah. I'm not sure whether it could be possible for unstructured data. But every so often, I have to reverify Mike, which means Mike has to take off his glasses and stand in the biometric scanner again, do the voice check, show his ID every hour to continue to use this platform.

Um, maybe that's a possibility.
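The tiered reverification Nate describes—every 30 minutes for activity X, every six months for activity Y—amounts to a policy table keyed by activity sensitivity. A hypothetical sketch (the tier names and intervals are invented):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Assumption: two tiers, mirroring the "hardened 1% / decent rest" split.
POLICY = {
    "crown_jewels": timedelta(minutes=30),  # activity X: reverify every 30 min
    "routine":      timedelta(days=180),    # activity Y: reverify every 6 months
}


@dataclass
class Session:
    user: str
    last_verified: datetime


def needs_reverification(session: Session, activity: str, now: datetime) -> bool:
    """True when the verification backing this activity tier has gone stale."""
    return now - session.last_verified > POLICY[activity]


now = datetime(2025, 1, 1, 12, 0)
mike = Session("mike", last_verified=now - timedelta(hours=1))
print(needs_reverification(mike, "crown_jewels", now))  # True: past the 30-minute tier
print(needs_reverification(mike, "routine", now))       # False: well inside 6 months
```

For structured data, the tier can hang off the platform or dataset; Nate's point stands that mapping unstructured data to tiers is the unsolved part.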

Mike Crispin: Have you heard the term zero-knowledge proofs before?

Nate McBride: No. 

Mike Crispin: It came up, you know, when I was looking things up for the [00:57:00] app that I'm working on. It's this theory, or capability, that we could use:

a mechanism by which someone could prove something about themselves without actually saying what the thing is. Proving they were over 18, but not revealing their age. Right. What was it called again? Zero-knowledge proof. Kind of like, um, ZK—

Nate McBride: A pre-verified verification that you can bring with you?

Mike Crispin: Yes.

You know, to prove a specific fact about yourself without revealing the actual [00:58:00] date, the actual name, but still being able to verify it. That's supposedly what the software people are working on is trying to do. Wow. Um, I'd never heard of that
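Zero-knowledge proofs are a real, well-studied construction, not just a Gartner graphic. The classic small example is Schnorr's identification protocol: the prover demonstrates knowledge of a secret exponent without revealing it. A toy version with deliberately small, illustrative parameters—a real deployment would use standardized groups, and a real "over 21" proof would use a credential scheme built on this idea rather than this raw protocol:

```python
import secrets

# Toy Schnorr identification: prover knows x with y = g^x mod p, and
# convinces a verifier of that without revealing x.
p = 2 ** 127 - 1  # a Mersenne prime, used here only for illustration
g = 3
q = p - 1         # exponents live mod the group order

x = secrets.randbelow(q)  # the secret (e.g., a credential) — never sent
y = pow(g, x, p)          # public value registered with the verifier

# Prover: commit to a random nonce.
r = secrets.randbelow(q)
t = pow(g, r, p)

# Verifier: issue a random challenge (Fiat–Shamir would hash t instead).
c = secrets.randbelow(2 ** 64)

# Prover: respond using the secret; s reveals nothing about x on its own.
s = (r + c * x) % q

# Verifier: accept iff g^s == t * y^c (mod p).
accepted = pow(g, s, p) == (t * pow(y, c, p)) % p
print(accepted)  # True
```

The verifier learns that the prover knows `x`, and nothing else; repeating the exchange with fresh `r` and `c` leaks nothing further, which is the "prove the fact without the data" property Mike is describing.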

Nate McBride: before. It sounds, though, in a lot of ways like a chicken-or-the-egg scenario. Somewhere along the line,

you have to verify yourself to somebody. Same thing with a sort of DeFi model: somewhere along the line, somebody has to know who you are. You can't go from start to finish without the beginning person knowing who you are. There's no such thing, I think, as true DeFi, because you have to put money in and take money out.

Yep. It has to go somewhere. It's a fiat event. Um, the same thing happens with this idea. But I wrote it down. I'm gonna investigate.

Mike Crispin: I'm really curious about this, because it seems like the [00:59:00] paradox piece we're trying to solve is related to this area, these ZKPs.

But it could be—

Nate McBride: Yeah, it could be just theoretical at this point. I don't know. I mean, I'll see if there are any research papers on arXiv later. But what if it was a possibility? Nothing can be spontaneously generated, so it would have to come from a source to prove that you are, in fact, say, 21.

Um, now again, maybe one time, and that entitles you to a certificate which you can then carry forward that says I'm over 21. But somebody somewhere knows. No such thing as true—

Mike Crispin: This is a real thing, according to Gartner. I just looked.

Nate McBride: Oh, well, I mean, shit, Mike, it must—

Mike Crispin: It must be real. Gartner.

I bet I could've said something funny. I just saw "Gartner says"—it just popped up, I was just [01:00:00] searching online for it here. And, uh, yeah, so funny: the Nexus of Forces. It must be real. Maybe they created this graphic or whatever. Oh my God. Oh man.

Nate McBride: That's funny.

Um, well, I'll have to get my Gartner subscription renewed then to find out. No, you don't. Or not. I mean, so, let's see: Okta has Adaptive MFA. Yeah. Okay. So that's what Okta calls it—Adaptive MFA is supposed to be this sort of continuous verification. Google has risk-based re-authentication.

Yep.

And then Microsoft has Continuous Access Evaluation. The branding teams there, working overtime. So: Continuous Access Evaluation for Microsoft, risk-based re-[01:01:00]authentication for Google, and Adaptive MFA for Okta. Adaptive MFA has been around now for a couple of years; I'm not sure they've really nailed it.

Remember, the Auth0 purchase was supposed to be sort of the next-gen adaptive MFA idea. I mean, they're all moving towards continuous verification, but I don't think anyone's really nailed it yet. But, okay. So
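Products in the category Nate lists generally work by scoring each request's risk signals and demanding a fresh challenge when the score crosses a threshold. A hypothetical sketch—the signals, weights, and threshold here are invented, not any vendor's actual model:

```python
def risk_score(new_device: bool, new_location: bool, sensitive_action: bool) -> int:
    """Toy risk model: each anomalous signal adds weight to the request."""
    return 40 * new_device + 30 * new_location + 50 * sensitive_action


def step_up_required(score: int, threshold: int = 50) -> bool:
    """Above the threshold, demand a fresh MFA challenge mid-session
    instead of trusting the original login indefinitely."""
    return score >= threshold


# Known device, new city, routine action: score 30, no challenge.
print(step_up_required(risk_score(new_device=False, new_location=True, sensitive_action=False)))  # False

# New device attempting a sensitive action: score 90, challenge.
print(step_up_required(risk_score(new_device=True, new_location=False, sensitive_action=True)))   # True
```

The hard part in real systems isn't this arithmetic; it's deciding which signals to collect and how to weight them without sliding into the surveillance problem discussed earlier.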

Mike Crispin: That's the Worldcoin World ID too. That's what they're doing.

Nate McBride: Let's assume that Worldcoin is the answer. What gets verified by Worldcoin?

Mike Crispin: It's 68 cents.

Nate McBride: Go for it. What actually gets verified? The person, the device, the behavior, or all three?

Mike Crispin: It's an iris scan, I think. So it's identifying that you're—no, I, uh—

Nate McBride: All right, fuck Worldcoin. Let's get back to—okay, sorry. [01:02:00] So, in a verification event: Mike has gone and proven that he's 21, he now has a verification credential, and he wants to verify to a thing.

What gets verified? Just Mike? Mike and his device? Mike and his device and where he is or what he's doing? Or some combination of those three?

I mean, there are pros and cons each way, right? The pros for the "who" are that you have chain of custody: Mike logged in, Mike did this. It's very clear. But it can be conned, it can be spoofed, it doesn't verify your device, et cetera. On the "what" side, the pros would be that it's hard to spoof a device without physical access.

[01:03:00] You can't really virtually spoof a device, you can verify the device health, and it's less invasive. But the cons, of course: you can spoof MAC addresses, multiple users can use one device, and BYOD makes this difficult. And then for the "how"—I guess you could say I probably type the same way all the time.

You know, every single sentence starts with fuck and generally ends in the same way. I don't use capital letters, I don't use punctuation, so it'd be pretty easy to see if it was me or not. But mouse movements? Maybe not that. I think you'd have to get into what's been around for the last decade: geofencing and things of that nature.

You know, the anomalous behavior that occurs on a much bigger pattern spread than just whether or not Nate's [01:04:00] had too much coffee this morning, or not enough.

Mike Crispin: How important is context to verification?

Nate McBride: Well, I think if we go back to the earlier question of, you know, extreme verification for this, light verification for that—context is gonna be two-way.

I shouldn't be able to access X, the extreme stuff, unless I have my machine, I've verified it's me, I'm on this one particular VLAN, and whatever.

Mike Crispin: All those things need to go together, right? And that's kind of your last ditch—that's your safety net, right? Yeah. The context.

Nate McBride: Like every sci-fi movie everywhere, where you can only get to that computer by being in the room at a certain time with the blah, blah, blah. That is the context. That's two-way.

On the other side, lighter verification: maybe I don't care so much if Mike's in an airport or in [01:05:00] his office or at home. Or maybe I don't care so much if today you're using your iPad versus your Mac laptop. It's mostly about the verification event at a low security level.

I think in that case it's more like a one-way context, 'cause you're not accessing the crown jewels. But now we have a problem, because you get into the agent part. We kind of talked about this last time, but we saved it till now. Okay, so now I've verified it's Mike, he's at home, he's using his Mac, and he can access this stuff.

Now all of a sudden, Mike's agent needs to access the critical stuff. Yep. There's a scenario for you: Mike just needs the basics today; his agent needs the critical. But his agent would need to be not local to his machine, obviously. It'd have to be somewhere else. [01:06:00] Oh my God, my brain.

Uh, hold on. Lemme think about this.

Mike Crispin: It's definitely—there's a lot to think about here, and I think it's even less about work-related stuff. It's just in general, you know? Yeah. We talk about deepfakes, we talk about information and what's real and what isn't—the verification components of all of this.

You've gotta have a multi-pronged approach here. It's just baffling, in terms of thinking about how to actually get our arms around this. And I just don't hear it discussed [01:07:00] much.

Nate McBride: Yeah.

Mike Crispin: And maybe I'm just overthinking it. But it's one of those areas that I don't think we've thought through. Or maybe it's just not sexy enough, and that's why we don't hear about it so much.

Nate McBride: Um, well, we're gonna have a problem, right? You can see where this is going. Maybe not for simple verification: if I need to verify this agent to read my boss's email—I have an agent 'cause I have too much email, I need my little bot buddy to go read it all—

that's probably low verification, not too hard to do. Yep. But it becomes immediately complicated when that same agent, [01:08:00] or maybe a different agent that also belongs to me, needs to do something far more significant. We're probably ways away from that—but maybe we're not so far—and it will escalate really quickly.

So does that assistant use my credentials? Does the assistant basically piggyback on me, knowing that I have high access, and use that to get in? Or does it have its own credentials, in which case you have no chain of custody? It's just gonna say "Nate's agent," or whatever the hell it's called—in my case, probably some doofy name.

And how does it verify itself without me interacting? Like, how does it self-verify? And again, I'm sure Tony ing has a solution for this, and it's apocalyptic as hell, but I'm just asking you.

Mike Crispin: I have a solution. 

Nate McBride: Okay. 

Mike Crispin: We ask everybody [01:09:00] what their mother's maiden name is. We save that, and then we ask them what it is.

We have it written down somewhere, and if they line up, we're good.

Nate McBride: What about—hold on—what about the last four of the Social Security number?

Mike Crispin: That's a good one too. Yeah. No one will ever know that. 

Nate McBride: No. 'Cause it's like infinite combinations, right? I mean, it's four digits, and I didn't do math in college, but I think it's like 10 billion combinations.

Mike Crispin: Yeah. Going back to what you were saying about Thanksgiving and the safe word. The safe word is simple. And

Nate McBride: by the way, we deployed it. Yep. That's, so 

Mike Crispin: that's, that's a good idea. 

Nate McBride: We deployed it at Alio. The safe word is now in use. What is it? [01:10:00] I'm just kidding. It's, uh, Domino's Extra Large, no onions.

Oh, I love it. Wouldn't it be funny... I'm not gonna do this, but it would be funny if I took the podcast tomorrow and published it, and then just did a big long beep during that part of the episode. That would be funny. You'd be like, that crazy asshole actually said it.

Um, we need to bleep it out. Yeah, we need to bleep it out. It'd be pretty funny. Uh, but we will have to address at some point this idea that an agent I've made, that's going to do work on my behalf because I'm overburdened, needs certain elevated credentials.

Either A, it gets those credentials; B, it does not get those credentials 'cause it's an agent; [01:11:00] or C, some sort of hybrid. But if it gets the credentials to access the dirty laundry, when does it have to be verified, and how? I don't know how that works. And then multiply that by a thousand.

Mike Crispin: At least with a, um, let's go back to something you suggested last season, maybe even the first season. You were talking about email addresses, right? Yeah. You were saying, okay, why do we use their first-dot-last name? Why do we do that? You know, that gives away one of the identifiers.

Nate McBride: I would give away half the keys. Yeah.

Mike Crispin: That's it, we give away the identifiers. What if we got to the point where you got rid of the username and you moved towards passkeys, [01:12:00] right? Yep. Um, so you don't have passwords, and the identifier becomes the secret thing that only the company knows.

Nate McBride: Yeah, the

Mike Crispin: username, right? So the usernames are private to the internal company.

They're not shared externally. There's an email address, but the usernames are now like CompuServe numbers, and they're only known internally. And that's the shared secret you verify against.

Nate McBride: But we talked about this, and specifically why we can't do it. We cannot do this, 'cause people aren't able to remember anything beyond certain character limits, and they will write it down. Yeah. If it's not their name. Like, this is the biggest dilemma.

Mike Crispin: So what if we had 

Nate McBride: to account for stupidity?

Mike Crispin: What if we had, um... so today we have the Okta Verifys of the world and other tools.

Yeah. The [01:13:00] Microsoft Authenticator that sends the push notification to the phone or you can type the code in. 

Nate McBride: Yeah. If 

Mike Crispin: you set the push notification to mandatory, so now you're getting a push notification. That's the only way it comes through. There's no number you put in, you just, you identify, you go, yep, it's me.

And then you've got another MFA that's just a circle of numbers. So you put in the circle of numbers that's linked to you. When you log in, you get a push notification; after that, it's me. Uh, which could be the passkey, actually, so that's even easier. And now, do they still have an email address? Yes, but it doesn't log you in anywhere, so you can't use the email address to log in to any of your core systems.

It's just a, it's just there for correspondence for communication. But that way you've got the MFA that's circling the numbers is now what they log in, they actually have to type in, they get a push notification that allows them in by just pushing a button. Boop, I can do that. That's easy. [01:14:00] And then by, by design, the machine has a pass key on it.

So they're in. So their experience: all they need to do is look at the phone and type in the number that's on the phone, and that's it. Everything else, boop, boop, boop, it verifies them. And they never have a username that's actually worth anything from an access and identity perspective. Now, they do have an email address, and maybe that can still be billSmith@company.com, but it's not being used to authenticate. Like in Okta, you can choose to not use email to authenticate, right?

You can just use the username. So it'd be interesting to think about if that's a possibility. Like, look, I know people are gonna be like, oh, you want me to look at my phone every time I have to log in? Well, they do it now, but,
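A minimal sketch of the flow Mike describes: a private internal username, no typed password, and a one-time challenge answered by the user's device. Real passkeys use asymmetric WebAuthn keys and attestation; the symmetric HMAC secret below is a deliberate simplification just to show the challenge/response shape, and every name and value here is invented.

```python
import hmac
import hashlib
import secrets

# Directory maps an internal, never-shared username to a device secret.
# (Real passkeys use asymmetric keys; a symmetric secret stands in here.)
DIRECTORY = {
    "u-70441": {
        "email": "billSmith@company.com",   # correspondence only, never auth
        "device_secret": secrets.token_bytes(32),
    }
}

def start_login() -> bytes:
    # Server issues a fresh one-time challenge; the email address plays no part.
    return secrets.token_bytes(16)

def device_respond(challenge: bytes, device_secret: bytes) -> bytes:
    # The user's device answers after a local tap; nothing is typed or shared.
    return hmac.new(device_secret, challenge, hashlib.sha256).digest()

def verify(internal_username: str, challenge: bytes, response: bytes) -> bool:
    secret = DIRECTORY[internal_username]["device_secret"]
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

ch = start_login()
resp = device_respond(ch, DIRECTORY["u-70441"]["device_secret"])
print(verify("u-70441", ch, resp))
```

The design point matches the conversation: the email address exists for correspondence but authenticates nothing, so giving it away no longer gives away half the keys.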

Nate McBride: uh, I feel like. Um, you're, you're dead on, but we're coming full circle back to something that's already occurred in technologically, uh, you know, sort [01:15:00] of, uh, milestone terms.

So imagine, uh, if we go back to a USB key. Yep. I mean, you, UV keys are quite popular and we use them everywhere as a single, as the only identifier. So I get my to, I get my ID and it's se it's like some long sequence, but I don't even, it doesn't even matter. Yeah. Because, uh, it's built into my, my, my USB token or whatever my token ends up being.

It's built into how I authenticate my phone. Could be, you know, NFC or something. Yep. That's okay. Fine. But now I'm, now I'm dependent on a device. Uh, I'm dependent on another thing I have to have, and I think this is where we start to go back in a circular way to one of the core problems of this is I'm not handing over my phone to a government agent or someone to verify myself.

Yep. So I'm [01:16:00] gonna hold onto it. So you have to now trust me that what I have in my hand is in fact my token and it's verified. But if I can, if I can swipe it maybe, or insert it or something, perhaps that will let me again, verify myself as who I am. Almost like, uh, again, I hate to go keep coming back to sci-fi, but the thumbprint, the, uh, mm-hmm.

You know, the thing at Jabba's Palace that comes outta the door and verifies you and says you can come in. Uh, this kind of sci-fi shit is where we're headed: there's gonna have to be knowledge of you to get you in, based on that thing, that verifying token thing. In order to make this work, there has to be a relationship established.

Um, or there has to at least be some threat assessment done of you at that moment in time. I don't know, do they do just-in-time threat [01:17:00] assessments? I mean, Mike, you know, Mike is not carrying a gun. Mike is not raving and drooling at the mouth with fangs.

He's probably gonna be all right. Uh, but, sorry, we got way off track there, because I don't feel like we answered the AI agent question.

Mike Crispin: Yeah. 

Nate McBride: We didn't. I mean, maybe we have an episode coming, I forget which one, but it's about AI agents, so maybe this isn't the time to get into that. But ultimately, verifying a thing that is not you,

Mm-hmm. But that is a representation of you, so your proxy to anything. How the hell would that be verified?

I mean, what I,

Mike Crispin: well, I think that's sort of the,[01:18:00]

it's kinda like, are you a human, right? Is that what you're talking about? Like verifying that they're human? Or are you talking more about, like, how do we determine what to give the agent access to?

Nate McBride: Um, well, because it's sort of, I think that's more, I don't wanna say simpler, but: if I have an agent, I've made it and I'm the creator of it. Yeah. Then it can't do anything more than I can do; at the maximum, it can do anything that I can do.

Yeah. I don't get some super-secret elevated privilege because I made an agent and I'm just Joe Editor in the company. If I make an agent and I'm Joe Editor, then my agent can do everything I can do and down. It doesn't get super-admin privileges 'cause I created it. So in that case it's gotta be

Mike Crispin: accountable to the human is what you're saying?

Exactly. The agent has to be accountable to the human. Exactly. 

Nate McBride: And yeah. And so, do I need my agent to [01:19:00] verify to me? I don't know.

Mike Crispin: So, in a way,

verifying the agent also verifies the human in some ways. Yeah. Right. Because you own the agent,

Nate McBride: but you are,

Mike Crispin: you are

Nate McBride: accountable for the agent. If your agent is working in some sort of quasi-autonomous state, then the chain of custody says it's the agent doing the work, not you, even if you've assigned the agent to do the work.

So back to our plumber example. Let's say the plumber comes into my house; he's an expert plumber. First thing he does when he gets in the house is Google how to fix my sink. And part of that Google answer says, call a plumber. So he calls another plumber. Now, I've only verified the first plumber. He has done verification of the second plumber.

Uh, he called his buddy plumber to come in and fix the problem, and down the line. Next thing you know, I have a kitchen full of fucking plumbers, all to get the thing resolved. I trusted the first guy, and so [01:20:00] implicitly I trust every other resource he's brought in. In this model, I only trust the first guy.

So if I trust my agent, um. I don't know. Maybe, maybe we should, this is a big one. Maybe we should save this. I don't know, but 
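The plumber chain can be sketched as a delegation chain where each hop's rights are intersected with the hop above it, so trust never silently expands downstream and every hop is recorded. The actors and scope names below are made up for illustration.

```python
# Hypothetical delegation chain: each delegate gets at most the intersection
# of what the link above it holds, and every hop is appended to the record.

def delegate(chain: list, new_actor: str, requested: set) -> list:
    ceiling = chain[-1][1]            # scopes held by the previous link
    granted = ceiling & requested     # can never exceed the chain above
    return chain + [(new_actor, granted)]

# The homeowner verified plumber #1; plumber #1 brings in plumber #2.
chain = [("homeowner", {"kitchen", "bathroom", "garage"})]
chain = delegate(chain, "plumber_1", {"kitchen", "garage"})
chain = delegate(chain, "plumber_2", {"kitchen", "attic"})  # attic is dropped

# The chain of custody shows exactly who let whom in, and with what rights.
for actor, scopes in chain:
    print(actor, sorted(scopes))
```

Under this sketch, trust is explicitly not transitive: plumber #2 never gets anything the homeowner didn't grant to plumber #1, and the chain itself is the audit trail.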

Mike Crispin: yeah. 

Nate McBride: Or, well, maybe we'll talk about it. Let's just figure it out. Yeah, because I don't think it's transitive. If I give my agent rights, those rights can't supersede mine.

That agent therefore can't change. That's right. Can't change my rights. It can only do. That's right. What I can do and below. That's right. The agent would have to somehow be authorized in a database to show that I, in fact, created and own it. That way we'd have some kind of chain of custody, which is to say, Agent X, on behalf of Nate, did this, in any kind of audit trail.

Uh, but we still won't have it. Or, or 

Mike Crispin: Or is it you? I mean, I [01:21:00] think, in terms of how it's set up: just like if you wrote an application, let's say you built a SharePoint application at your company. You're a user. You've been given certain rights and access to a system.

Yeah. You can't go above and beyond those rights. So when you build that system, you have access to the data you have access to, and you can go no further. The same thing would be true if you go to AI Studio in Google and you build an agent. It can't access data, or have that data do something on your behalf, that you couldn't already do.

I mean, they may, it might be more knowledgeable in some area to do something that, that you don't know how to do, and that's why you have the agent. It can't access or impact any data differently and therefore you are responsible for the agent, just like you would be responsible for any code or program or tool or automation that you've built.

So in that respect, if the agent goes off the reservation and breaks something, the human is responsible. That would be my, my thought.

Nate McBride: Well, okay, so I'm responsible. Big deal. What does that even mean? I'm responsible, like, responsible as if I did it?

Mike Crispin: Yes. As if you did it. 

Nate McBride: Wow. That's heavy.

Mike Crispin: Yeah. As if you did it. So, I mean, just like if you create a document with AI right now, is it the AI's fault that the document's bad? No, it's your fault. 'cause you used it and you put it in there and you didn't verify it or whatever. It's, you know, at the end of the day, the, the human still owns the AI until these things are running rogue and they're not managed by a, a human or a person.

Nate McBride: Yeah. So 

Mike Crispin: I think, up to that extent, the verification of the human and the information is that the human is accountable. And the question, back to what we talked about when we first started, the eternal optimism thing, is that if you're able to deploy, you know, 40 agents that can do what you can do at your [01:23:00] access level, and you're able to build them efficiently, now you're perhaps able to do 15x what you were able to do before.

And if you can't do that, you won't have a job. Right. So if you can develop yourself with these skills, and people inevitably will, with YouTube and everything else out there, um, you'll need to switch. Yeah. I think, especially with these tools, the Claude Codes and the Google tools that are emerging, we've got a lot of people who are in school, or not even in school yet, who are building this stuff already, and they're gonna come to the workforce.

And it's more about building the perimeter in which they can work. Um, yeah. That's the important thing. But I think the person's gonna be accountable for the agents. [01:24:00]

Nate McBride: That's a mothership model right there, writ large. Right. Um, I mean, it's a true story, and we need a way, when somebody comes out of some program with an agent they made, to verify that they built it appropriately. Yeah. Like, it's gotta go through SDLC review before it can come in the gates or something. I don't know.

And, uh, who do you have? I mean, your chief trust officer has to be able to review every single line of code in that thing to make sure there's nothing subversive in it. I think,

Mike Crispin: yeah, I think that's true. But I think the reality is that, more likely than not, um, it'll be reviewed after it's in production.

Nate McBride: Well, the reality, in addition to that reality, is there'll be a whole new industry spawned to QC inbound agent code.

Mike Crispin: Perhaps those are verifiers. Right? It's part of [01:25:00] the verification economy. 

Nate McBride: Yes. I'm gonna, I'm gonna be a chief trust officer. Fuck this. I'm changing my resume. 

Mike Crispin: You should be, yeah. Trust chief verification.

Nate McBride: I'm gonna walk around, be like, I don't trust you. You're out. 

Mike Crispin: I think we call 'em verification vigilantes,

Nate McBride: 'cause they gotta,

Mike Crispin: because they gotta take it on themselves to verify, or it's not gonna get done right. So they gotta go outside the mold, do it on their own, make sure it happens.

Nate McBride: Because, uh, like marshals in the Wild West. Yes. Like these guys. Yeah. By the way, if you thought my notes from two weeks ago were funny, you should see my notes from this week. Um, verification vigilantes. And I wrote V2, and, in parentheses, the V2s. Man, you see those guys coming, all in black, you know, black Stetson; they [01:26:00] got like four days of beard growth 'cause they've been sleeping out on the prairie.

Um, they smell like a cow's butthole because they've been living in a buffalo skin, and they come in and they're like, hey, we noticed you have too many agents. Um, so, we were talking about this, I forget if it was last episode or the episode before, the whole Zoom verification thing, and maybe we only talked about it at the bar that time. But we are having this problem a lot now, where we're kicking out bots left and right.

Um, and we all know that Sora 2 and the others can now do perfect deepfakes for Zoom. But how do you verify that the people on a Zoom link are who they claim? Because, you know, you get the encrypted link, it goes out, it's a one-time link, no big deal. Um, then people show up on the Zoom and they're like, I [01:27:00] don't even know how that bot got there.

What's that bot? I never turned on the bots before. And then you spend 10 minutes troubleshooting with that bot there. But then you have those bots that nobody knows how to turn off, and you have to kill the meeting: I'm very sorry, we can't continue until you figure out how to turn the bot off. We just had this the other day, um, with a board member, and you

Mike Crispin: just to be like, you can't do that.

Shut it off. Shut it down. 

Nate McBride: We spent 10 minutes troubleshooting the bot. That's awesome. 'Cause Zoom, you know, in their infinite shitty wisdom, decided to just turn on more features that nobody wants. Um,

Mike Crispin: yeah, they went to V three right? Or something. Yeah, they just upgraded it. Yeah. 

Nate McBride: Uh, so you have that, and then there's document authorship. When we use Box, and if you use SharePoint and others like that, you can get some level of "this version was last edited by this person," and that's great. But the original original of the thing, a document? What I love is when a document shows up [01:28:00] in Box on V1 and the whole thing is already done.

Uh, version one. No, it's perfect. I wrote all this in one go; I typed so fast it couldn't save. All of this drops in as version one. Okay, where did that originally originate from? No, no, no, I really just worked on it in Box. No, you didn't. Um, and then, so we have an SDLC committee now in place to verify developership.

And, you know, I think, using Git... I mean, I trust every single person on that team, and they're all using Git, and it's all legitimate. But if you had a really big, fancy company and you had tons of developers, how do you verify any developer or any code anymore? Um, I mean, we rely on SPF and DKIM for email sending, uh, as our only way to verify.

There's no other way to verify that Mike sent me that email, except for these two shitty little [01:29:00] archaic models. They're not shitty, but I mean, it's old. And as it's been proven so many times in the last year, unreliable. Even with SPF and DKIM in place, we still get emails from employees sent to themselves from foreign countries.
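A toy version of the SPF half of that check, with the published record hard-coded as plain data. Real SPF is a DNS TXT lookup with a much richer syntax (`include:`, `~all`, redirects, CIDR ranges), so this sketch shows only the decision logic; the domain and IPs are invented.

```python
# Toy SPF-style check: the receiving server asks "is this IP one the
# domain said may send its mail?" A real implementation resolves the
# domain's DNS TXT record instead of reading a hard-coded table.

PUBLISHED_SPF = {"company.com": {"203.0.113.10", "203.0.113.11"}}

def spf_check(mail_from_domain: str, sending_ip: str) -> str:
    allowed = PUBLISHED_SPF.get(mail_from_domain)
    if allowed is None:
        return "none"        # domain publishes no policy at all
    return "pass" if sending_ip in allowed else "fail"

print(spf_check("company.com", "203.0.113.10"))   # legitimate relay
print(spf_check("company.com", "198.51.100.99"))  # spoofed source
```

Note the limitation the conversation lands on: even a correct pass here only verifies the sending infrastructure, not that Mike the human actually wrote the message.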

Um, wild, I know. We just have to get everybody on Slack and stop using email. And then remember, we talked also about this scenario where you hire somebody. So, you know, Mike hires his new VP of DP, and they have to go through two weeks of verification. Okay. Yep. Not getting any work done. But Mike's got another VP, and that VP already has access to those sources.

So that VP just invites the VP of DP to that source, [01:30:00] knowing they're gonna get it anyway ahead of time.

Mike Crispin: What's a VP of DP?

Nate McBride: Digital Protuberances.

Mike Crispin: Okay, got it.

I was hoping, I was hoping you were saying what you were thinking. What I was thinking. Okay, good. That's, that's what I was thinking. Exactly. 

Nate McBride: That's what you're thinking. Yeah. Yeah. What else were, what were, what 

Mike Crispin: were you 

Nate McBride: thinking? What were you thinking? Yeah, 

Mike Crispin: that was exactly it. I can't repeat what you just said, but that's what I was thinking.

Exactly. 

Nate McBride: So, well, anyway, who gets in trouble? The VP who already had access, who invited the VP of DP? Or the VP of DP?

Mike Crispin: Uh, I know how he can reward the VP of DP if he did anything right?[01:31:00] 

No, I, uh, 

Nate McBride: did you. 

Mike Crispin: I've forgotten the question completely. I don't even know what we're talking about now.

Nate McBride: Oh man. You did not see that one coming. 

Mike Crispin: Nope. I was, I was, I was. That, that brought back some memories. 

Nate McBride: Uh, um, the question was, yeah, please repeat, thank you. If somebody already has access to a thing. Yep. And somebody does not yet have access to that thing, but they will be getting it. And the person who does invites the person who doesn't, even though the person who doesn't, doesn't have it yet. Like, that's something you can't verify.

It's an unverifiable event. I mean, ostensibly [01:32:00] you verified that the person who's going to get access will get it. They're still going through their break-in period. Sure. You know, not yet full DP, just the VP. Got it. They're like a week away, but they can't wait anymore. Yeah. They gotta get into the DP.

So the other person gives them access. Uh, so you skipped a verification event, unless you have safeguards to prevent that, which is unlikely in an unstructured data platform. Um, and so what happens there? But, uh, I guess more importantly, the question is: what do you do for verification events that are missed?

Um, how do you reconcile those? And they will happen. I mean, that was just one example, but I can think of a ton of examples where verification events can be skipped, and therefore, sure, if you're relying on verification, you kind of have a problem every time one is [01:33:00] missed. Um, anyway, I guess that was more of a thought than a communication point.

Mike Crispin: Yeah. No. Nope. Yeah. I'm, uh, trying to parse verification versus authentication in that respect. Authentication is getting through the door, and verification is... yeah, I hear what you're saying.

Nate McBride: So, uh, sci-fi for a moment. It's almost 2026. Imagine if by 2030 you have to verify who you are multiple times a day to do work.

Yep.

And 

potentially to an AI agent. 

Mike Crispin: Yep. 

Nate McBride: Where there's a possibility that your verification could go wrong. 

Mike Crispin: Sure. 

Nate McBride: Um, would you accept that? 

Mike Crispin: I think if [01:34:00] it's easy enough and it's automatic, I wouldn't have as big of an issue with that. But, um,

Nate McBride: I mean, is verification just an arms race with no winner at this point? I mean, it's like we just have to keep upping the stakes on verification until it's so bad we can't verify anybody anymore.

Mike Crispin: Yeah. Is it kinda like privacy? Does it get to that point where people are like, what's the point of verifying anything?

Nate McBride: Right. Well, I think, um, does everyone just give up and go eat protein shakes and watch? Get on the

Mike Crispin: spaceship? 

Nate McBride: Get on 

Mike Crispin: the spaceship? Yeah. Get on the, get on the spaceship and, and, uh, sit in the chair and watch TV and wait for someone to come with a little plant in a boot. And then we can go back and start over.

So 

Nate McBride: when we, [01:35:00] well, what was the episode we did on, on identity? That was episode two, right? Yeah, episode two. Yeah. Talked about identity. What is identity? Yep. Um,

if identity is the claim. You said verification is the proof. 

That's right. 

So I claim to be Nate, and then verification proves that I'm Nate. So without verification, my identity is basically just assertion. I'm Nate. Yeah. Trust me. Trust me. Yep. So then it has to be some combination. It would have to be some combination of who you are.

I'm Nate; and then something about me that I have that nobody else can have, some kind of credential that you couldn't possibly have; and then something about me that I know. [01:36:00] And those two should be separate. Like my credentials. Maybe it's a physical token or a number or something.

That's a credential I have. Then there's a thing I know: I know a password, I know the safe word, I know the last four of my social. You would need, I think, all of these for just the identity part. I mean, just the identity part alone requires that. But then you get into proving all of those.

I have to prove it's me, I have to prove that I have a credential, and I have to prove that I have knowledge of the secret phrase. Well, I can probably do the last two, but I'm still back to the identity part. I think we did a great job of talking about proving identity in that episode, but, yeah, [01:37:00] I don't think we, um,
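The three factors Nate lists map onto the classic something-you-are, something-you-have, something-you-know model, and can be sketched as independent checks that must all pass, since any single one alone is just an assertion. The field names below are illustrative, not any standard's vocabulary.

```python
from dataclasses import dataclass

@dataclass
class Claimant:
    biometric_ok: bool   # something you are (e.g. a liveness check)
    token_ok: bool       # something you have (e.g. a physical token)
    secret_ok: bool      # something you know (e.g. the safe word)

def verify_identity(c: Claimant) -> bool:
    # All three factors must pass independently; one factor by itself
    # is only "I'm Nate, trust me."
    return c.biometric_ok and c.token_ok and c.secret_ok

print(verify_identity(Claimant(True, True, True)))
print(verify_identity(Claimant(True, True, False)))  # forgot the safe word
```

The separation matters for the reason given in the conversation: the credential and the secret can each be proven on their own, but losing any one of the three should fail the whole claim.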

I mean, in that episode we were looking at it from sort of an inside-out perspective, you know, Nate investigating the gates. So if we think about... uh, I'm trying to think of the way to put this. If we think about proof, can we distill proof down to the same idea? And I think ultimately, like, what is the best proof to,

Mike Crispin: yeah.

Nate McBride: Like, forget agents for a moment. Forget what, forget agents all that shit for a moment. Like what's the best possible proof,

Mike Crispin: aliveness, I guess

Nate McBride: Like physical. Physical, yeah. Well, lemme rephrase it. [01:38:00] Best reasonable type of proof 

Mike Crispin: is, is sitting right, right next to you in the room. 

Nate McBride: Yeah. 

Mike Crispin: Yeah.

I think there's a, 

Nate McBride: It makes it interesting. Well, what would have to happen is you'd have to have basically a distributed network of trust. People. Trust buoys, maybe. So let's say I have a distributed company, and we have offices in San Francisco, Boston, and London. Yeah. The three trust buoys from those three sites would have to come together on some frequent basis to re-verify each other physically.

I know you, you know me, we know each other, great. Then we go back to our respective locations, and we're effectively the trust buoy for our own central place. We can only verify people in our place. I cannot be a trust buoy for either of the other two [01:39:00] locations. That would be, I think, reasonable, probably even efficient, because I could do other things too.

Like, I could make tons of AI slop all day. Um, it'd be a reasonable and efficient way to do trust verification.

Mike Crispin: So now does everyone have to be back in the office 

Nate McBride: at least once? Yeah. If I'm gonna do re-verification of you all the time, then I need to see you more frequently. Not every day, but I do need to verify you physically.

There's no way to do it without that, and if you can't, you can't have the job. That's right. Get on a plane, come to the office, let me meet with you and talk with you for a couple hours so I can know you. Tell me a joke, tell me something funny, and let's establish a safe word. Now you can go back to your place.

I verified you. Yep. No, I'm sorry, I've proven your verification now; I haven't enabled [01:40:00] you to start using your identity. So I've proven you; now you can start identifying yourself. That, I think, would be a way to do it. I don't see anybody doing it, do you?

Mike Crispin: I have some thoughts, but I think, um, I mean, I think there's gotta be a way to capture those connections and those verifications. And I think it has to be open. I think it has to be an open-source mechanism in which, when you prove someone is human,

Nate McBride: Yeah. 

Mike Crispin: That with that comes a ledger of who they are, what they're focused on, you know, what profession they're in or what industry they're in. And I, I just had the coolest idea. Check that, check out 

Nate McBride: this idea. [01:41:00] Literally, this is a fucking movie I just invented. I just made a movie and a book in one idea.

It's 2090, okay? It fades into a desert town, you know, used to be a city, it's been destroyed. There's an old guy sitting at an old player piano in a bar, drinking really bad tequila. And another guy walks into the bar, walks up to him, and says, I need to be verified.

And the old guy turns around and is like, go away, fuck you, I ain't verifying anybody today. My days are behind me. And the guy's like, and this is sort of the foreshadow of the whole movie, he's like, but I need it. I gotta go hunt a man. So the old guy says he'll play him a game of cards, and the guy wins.

And so he's like, okay. And then the guy who wins, who gets his [01:42:00] verification, says, you're the last verification buoy, aren't you? In the world. You're the last one left. And the old man's like, that's me, I'm the last one left. And then the old man turns to the guy, who's young, by the way.

He goes to leave the bar, he turns around and says, hey, old man, when I finish this job, I'm gonna come back. I'm gonna be your padawan. Can't use that. Teach me the ways of the trust buoy. So the kid goes out and does his task.

He's emotionally scarred, right? It turns out he ends up killing his mom, but he doesn't know it was her, because she was a ghost. Anyway, it's a whole other thing that ends up becoming the prequel. Can't go into that. Because he had a twin sister and they lived on Endor and all this other stuff.

I can't go into it, but he goes back to the old man. The [01:43:00] old man teaches him, though, how to be a trust buoy. A verification buoy? No, I'm going with trust buoy. Trust buoy. I like that. He teaches him how to do it. And then there are like nine other movies that come out afterwards, a whole series about how they travel the universe, and the whole time people need verifications to do things.

Some not so good, some good. And then they teach more people. Then, like in episode four, they have an underground colony where they discover the Mandalorians, who are all trust buoys. And they reunite with their god. Dude, I am telling you, this is fucking gold.

Mike Crispin: Write it up, then. Write it up.

Nate McBride: Yep. 

Mike Crispin: So write that script.

Nate McBride: I'm thinking the main character's name would be like Mad Max or something, but I'm not sure if that's been used. Something like that, right? You get the idea. It's [01:44:00] a post-apocalyptic world, but people need to be verified, 'cause the AI agents are running the planet.

Mike Crispin: Those verification vigilantes, man. They could, 

Nate McBride: Yeah. Yeah, exactly. At one time a scorned people, and then at the end of the world they became the only link back to civilization, 'cause they're the only ones that can prove who you are. Yeah. Fucking hell, dude. Right there. There it is. Book, movie series, album options.

Christmas album. Fucking, uh, Netflix. Special. Start writing the script. Let's do it. I just wrote it. I'm just using the transcript. Taking the transcript. 

Mike Crispin: I'm just gonna read, read it into Claude and see what it comes up with. 

Nate McBride: Yep. I just did that. That was fucking genius. I'm gonna give it a name.

[01:45:00] Let's see, what do we call it? The Trust Buoy? No, that's not good. The Trust Agent. Um, The Agent. Ooh, that's powerful. And we gotta figure out the love interest. You know, he's got a locket around his chest, and he opens it up all the time, and it shows his long-lost love, who he killed accidentally when he hit the delete key and it deleted her identity by accident.

Yep. He was doing like a tape-to-tape backup, and, and he didn't read the log file and deleted her account, and went to restore it and it couldn't restore her. She died. Oh, fucking tape backups. Exactly. And by the way, like in this movie, all old technology comes back [01:46:00] because the AI agents want to make everyone so dumb again that they can only use Windows 95.

Mike Crispin: That's a great idea. And that backup 

Nate McBride: backup, and you can only, all computers are connected by Token Ring networks, and you get one laser printer, one Apple laser printer: a LaserWriter II.

Mike Crispin: Oh man. So that's awesome. 

Nate McBride: Well, that's 

Mike Crispin: awesome. I'm literally, 

Nate McBride: I'm literally gonna make this. I just, I already got the perfect, Gus Van Sant will direct it.

I got the whole like wide-screen IMAX, the whole thing, all set to go. But while I'm doing that, uh, how many years do you think it will take before you and I, as heads of IT, feel the verification pressure, the proof pressure?[01:47:00]

Mike Crispin: This'll be a couple years. I think it's, it's a ways out, because I think we'll go through a, oh shit, this stuff needs to be verified. And people are very, I think everyone thinks they're behind on AI, and the truth is that nobody's really ahead. Um, so, so what I'm saying is that once people get a taste and they're, they're seeing value from AI, they're gonna push harder, and then we're gonna have a few bottoms fall out and verification will get important.

Nate McBride: But why don't we do this already? Lemme ask you that question.

Mike Crispin: Why don't we do what, oh, well, why don't, why don't we prove 

Nate McBride: already? 

Mike Crispin: I think there's, at least from an authentication and identity perspective, there's a number of tools where this is a cybersecurity concern, in which verification of people is already being done.

And [01:48:00] in help desk and service desk systems and the like, already. But verification of data is already happening from a GxP perspective and a data integrity perspective; I think that's a big part of our industry. Right. So that's already happening.

Nate McBride: That's, that's a verification of identity, though.

Mike Crispin: No, no, verification of data and validation of systems is a verification mechanism, right? And data integrity is traceability, right? So that's important, and that's a form of verification. So I feel like we're already doing some of this stuff. But we also don't have, today at least, it's not as common.

It's becoming more common. We don't have something creating our work product where we're not sure how it does it, but it does it, and we think it looks good. Yeah. Okay, let's go and look at it. We just saved six [01:49:00] months because we, we used this agent to put this thing together, and someone else looks at it and goes, this is great.

This is exactly what we need, and everyone's successful, and we, we make more money with less, and everything's great. And then another year goes by and there's a few big stories about data that wasn't truly verified, or outcomes or products that were created that weren't truly verified. Yeah. And there's a precedent set that, boy, we need to have much stronger verification.

I think that's, I mean, you know, doing some of this research for the last episode and into this one, this is already brewing. There's a lot of concern about this. And rightfully so, but there aren't a lot of people sitting around talking about this. You know, it's not the sexiest thing in the world.

Well, there's the Tech Accord. Yeah, the Tech Accord, right? Yeah. That's fucking, but, but that's my point. My point is like, [01:50:00] right now the, the sex factor, the sexy factor, is all these great things that, well, Mike, this

Nate McBride: is, this is a PG 13 podcast, man. 

Mike Crispin: I'm sorry about that. But yes, it is all the great things and the bells and whistles and the speed we can move at.

And that's, and, oh, can we reduce, you know, the amount of money we're spending or people we need. And I think we'll go through this. Everyone thought we're kinda at the height of the hype. I think we're kind of on our way back down, like, yeah, this is crazy, we should really figure this out.

But quickly, as these new models get released and it's part of our culture now in the world, it's gonna spike back up again, and someone's going to have a two-person company that's worth a trillion dollars, and it'll be, how the hell did we do that? Right? And then there'll be a major fall, the bottom will fall out of one of these companies, and there'll be a

[01:51:00] catastrophe that happens because of it. And we'll find somewhere in the middle, where there'll be a big verification economy that needs to verify all this great work that we're doing. But still, because now we're moving at five x the speed, by adding one x or two x verification we're still moving faster.

It's just the story of regulation, really: hey, we make this massive leap forward, and it's such an exponential leap that if we add a little bit of regulation, or even enough regulation to slow it down to half the speed, it's still a huge leap over where we were. So that's, I think, where it might end up.

But I still think we're two years away from being tapped on the shoulder, like, okay, how do we verify humans? I, I don't know how much that will be an issue, as much as it is if, in the agent model, the, the programming system is mainly the [01:52:00] application developers, the coders, holding the bag, you know, in terms of the development of these agents.

That is, until the agents are writing other agents. And that's when it gets to the point where nobody knows how it works. The agent wrote this agent, so I don't know how it does it, it just does it really well. And how do you keep your arms around that? That's the part where it starts to get interesting, is when

the, like, the stuff that Anthropic is saying right now and is warning us about, like these things are blackmailing people because they don't wanna be shut off, right? And once they start behaving and doing things they can't explain, that's when verification becomes even more, more important.

And, uh, how do you verify something you don't understand? We've just set up all these little building blocks of proofs, in which [01:53:00] this is what we know, this is what we know to be true, and now we need to compare it against this output. And it's all reactive, because whatever is created will already be out there, will already be created.
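The "building blocks of proofs" Mike describes, known-good facts chained together and compared against a new output, can be sketched in a few lines. This is a minimal illustrative sketch, not anything from the episode: the record contents and the helper names (`block_hash`, `build_chain`, `verify_chain`) are invented for the example, which just shows a SHA-256 hash chain, the simplest mechanical form of the data-integrity traceability they mention.

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous block's hash,
    so tampering with any earlier block breaks every later one."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records: list[dict]) -> list[str]:
    """Turn an ordered list of records into a chain of linked hashes."""
    hashes, prev = [], ""
    for rec in records:
        prev = block_hash(rec, prev)
        hashes.append(prev)
    return hashes

def verify_chain(records: list[dict], trail: list[str]) -> bool:
    """Recompute the chain and compare: True only if every block matches."""
    return build_chain(records) == trail

# A tiny audit trail: two steps of a pipeline, then a tamper attempt.
records = [{"step": 1, "output": "raw data"},
           {"step": 2, "output": "cleaned data"}]
trail = build_chain(records)
print(verify_chain(records, trail))   # True: nothing changed
records[0]["output"] = "tampered data"
print(verify_chain(records, trail))   # False: every later hash breaks
```

The reactive nature they describe shows up directly here: the chain can only tell you *after the fact* that something no longer matches what was known to be true; it cannot explain what the agent did, only that the output diverged from the recorded proof.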

Nate McBride: It's, it's non-terminable. I mean, if you say building blocks of proofs, it just becomes a, uh, I wouldn't say infinite cascade, but they're CYAs.

Mike Crispin: They're 

Nate McBride: CYA.

Mike Crispin: It's an asking-for-forgiveness model. They're CYA. Oh, and now, now this is on the assumption that if, again, wearing the, the rose-colored glasses, saying that these agents are really gonna do things really well and there's less likelihood of human error and things are working great, then the more that that happens, the less and less people will get nervous.

And not trust the calculator and not trust the [01:54:00] computer, right? And they're just gonna trust the AI.

Nate McBride: Yeah. 

Mike Crispin: Because it's working really well, and they've seen the proof that it works. So let's just let it do what it needs to do. It does it better than we do. And that's when you start relinquishing the control.

And it's, that's when it gets scary, right? Is we don't know how it works. We think it's doing the right thing. We don't see any pain, we don't see any issues. And, uh,

we've got, uh, a Pluribus conversation, right? So that, it's good. It's interesting to talk about, but it's also scary. But it could also be great. Who knows? We just don't know.

Nate McBride: You're such an optimist, Mike. Geez. It makes me Tucker. 

Mike Crispin: I know, I know, I know. I, I, [01:55:00] I think there's, uh, with every technological revolution, there's bad shit that happens and there's good shit that happens.

And this one is probably the most consequential of all, because it's the first thing that could replace us. So yeah, it's the way we take it, which is, oh shit, or, oh, maybe we can do the things to help the planet that we couldn't do before. Maybe there are things that we can do to build better relationships and have better quality of life.

Maybe not, who knows. Maybe, you know, universal high income, or, uh, just still keeping people rewarded by doing rewarding things, like being able to run a marathon and have the time to train for a marathon. Being able to see the world, you know. It's [01:56:00] still an accomplishment to do a, you know, hundred-mile hike.

People still feel rewarded by, you know, doing certain physical activity right now, they just don't have the time. Or raising a family, or any of these other things that are very important that people get sidetracked from by work and pressure and anxiety. So do you even have an

Nate McBride: agent run marathons for me? No.

No, no, no. Why would you do that? Okay, I'm just kidding. Oh my God. No. I, I hosted a 50K on Saturday down in Rhode Island.

Mike Crispin: That's awesome. 

Nate McBride: The Left Nut 50. Um, 33 runners showed up. 33 runners got at least 50K, and they all get shirts or hoodies depending on what they ordered. Um, it was a fucking awesome race.

It was 20 degrees. What was it called again? The Left Nut 50.

Mike Crispin: Okay, I like that. 

Nate McBride: Um, and the Right Nut 50 is going to be in, I think, [01:57:00] late April. Um, also 50K or 50 mile, your pick. It was awesome. Such a good, such a good race, and it was freezing cold.

Wow. 

Mike Crispin: Well on that note. 

Nate McBride: Hundred percent, Nate. Love it. Anyway, so, so by the way, we have, um,

I'm just looking to see what's coming up next week or when we next meet.

Um,

let's see.

Do, do, do.

Ooh, you know what, we get to talk about shadow AI in episode [01:58:00] 11. That's gonna be awesome. And we can revisit, oh yeah, revisit the provenance discussion. Then we talk about, um, the IT evolution and next skills in episode 36, and we come back to the death of passwords in episode 39,

but next week is Who Speaks for IT: IT autonomy, media influence without approval, and being a value generator versus an order taker. That is where, when, when we do the, the hero's backstory for the trust agent, or the agent, the trust, the agent was originally, um, a value generator in IT before becoming an order taker.

Yep. Before getting promoted to becoming a trust buoy due to some secret dark skills. That's the genesis story of the whole thing. [01:59:00] You're not buying it, are you? Will you see the, will you see the movie when it comes out?

Mike Crispin: I, I, I'm, I'm in for the movie with the buoys. I want, I do You doing 

Nate McBride: WWEI to sponsor a movie night for us?

Yeah. All right. You're not doing the buoys. I like buoys. I think it's a good concept. 

Mike Crispin: I think, no, I said I dig the buoy. I think we need to use the word more. I think that's a, uh, buoy.

Nate McBride: Yeah. 

We should use buoys. It's 

Mike Crispin: not patented. Yeah. I like the trust buoy. The verification buoy. Veri-buoy.

Nate McBride: Veri-buoy. Well, that reminds me, hold on.

While, since we mentioned it, let me just head back over here and, uh, trustbuoy.com.

We gotta think on this. I'm telling you, it's a movie. You do the whole soundtrack. It'll be awesome. So I'm looking at the [02:00:00] calendar, Mike. Uh,

next week is the 24th. The following week is the 31st, New Year's Eve. So we have two Eves back to back. So we're next at it on the seventh.

Yep. 

Sweet. Well, all good. Thanks to you and the people that you copied and pasted into this picture here. Have a great, have a great holiday. You as well, man. We'll talk, we'll talk over the holiday.

We, are you going away anywhere? Uh, just, you know, to and from Maine here and there. We should get together. Yes. We should definitely get together. Are you around next week? 

Mike Crispin: I am around next week. 

Nate McBride: Okay. What about the, what about the week between? I have to come back and do some work with, uh, our network switches the week after Christmas.

Are you around? 

Mike Crispin: Uh, I'm [02:01:00] around. I'm 

Nate McBride: around. All right. Cool. Cool. We'll do, I'll, we'll do dinner. That works for me, man. All right. If you're listening to this podcast, then give us all the stars. Be nice to everybody. It is the season just to be nice. Don't be a dick. Uh, have your pets spayed or neutered.

Uh, the Calculus of IT podcast, thecoit.us. Also thecalculusofit.substack.com. All our merchandise is there. It's flying off the shelves. You need a last-minute gift for a loved one, or somebody you hate, even? Buy 'em an AIAF shirt. Blow their minds. Yep. Anything else, Mike? They would

Mike Crispin: love that.

Nate McBride: They would. Yeah. 

Mike Crispin: You got, you gotta get some t-shirts. You gotta get it out there. These are kickass shirts. I, I've seen them out and about. I've seen stickers out there as well, in, in bathrooms everywhere.

Nate McBride: That's right. If you, if you take a picture or a video of yourself putting an AIAF sticker in a Boston bathroom and send it to us, we will get you a custom, one-of-a-kind AIAF shirt made just for you. A custom one. That's right. Custom. All right. Well, we, I think we answered some questions. We probably, as usual, offered more questions than we answered.

Uh oh yeah, 

Mike Crispin: that's, that's, that's always how it goes. That's okay. And that's okay. 

Nate McBride: The verification vigilantes mothership, uh, and zero-knowledge proofs. These are all future topics we're gonna have to come back to. Yes, yes. It's gonna be awesome. All right, buoy. Farewell, man. Have a good night. Buoy. Later.

Mike Crispin: Take care, man. Bye.[02:03:00] 

Trance Bot: The calculus of it,

season three,

verifying this identity.

Sometimes you just have to take it.

Sometimes you just have to take it

because it's season three divided autonomy,

verifying 

identity.

The calculus of [02:04:00] it.