The Calculus of IT

Calculus of IT - Season 2 Episode 6.1 - Exploring the New Digital Wave of Rationalization

Nathan McBride & Michael Crispin Season 2 Episode 7

So much research went into preparing for Season 2 that there was no practical way to call out all of the best articles and papers we read. A few really stand out, however, and we wanted to highlight one of them tonight in a special Calculus of IT Jort.

In this Jort, Mike and I took some time to discuss an article by Lambèr Royakkers (from the School of Innovation Sciences, Eindhoven University of Technology, The Netherlands) and Rinie van Est (from the Rathenau Instituut, The Netherlands & Eindhoven University of Technology, The Netherlands).  The article is entitled: "The New Digital Wave of Rationalization: A Loss of Autonomy".  From the abstract:

The new wave of digitization and the ensuing cybernetic loop lead to the fact that biological, social, and cognitive processes can be understood in terms of information processes and systems, and thus digitally programmed and controlled. Digital control offers society and the individuals in that society a multitude of opportunities, but also brings new social and ethical challenges. Important public values are at stake, closely linked to fundamental and human rights. This paper focuses on the public value of autonomy, and shows that digitization—by analysis and application of data—can have a profound effect on this value in all sorts of aspects in our lives: in our material, biological, and socio-cultural lives. Since the supervision of autonomy is hardly organized, we need to clarify through reflection and joint debate about what kind of control and human qualities we do not want to lose in the digital future.

If you're interested, you can purchase a copy here: https://www.igi-global.com/article/the-new-digital-wave-of-rationalization/258848. Otherwise, we hope you enjoy the Jort!!

Support the show

The Calculus of IT website - https://www.thecoit.us
"The New IT Leader's Survival Guide" Book - https://www.longwalk.consulting/library
"The Calculus of IT" Book - https://www.longwalk.consulting/library
The COIT Merchandise Store - https://thecoit.myspreadshop.com
Donate to Wikimedia - https://donate.wikimedia.org/wiki/Ways_to_Give
Buy us a Beer!! - https://www.buymeacoffee.com/thecalculusofit
Youtube - @thecalculusofit
Slack - Invite Link
Email - nate@thecoit.us
Email - mike@thecoit.us

Jort 1 - Final - Audio Only
===

Nate McBride: [00:00:00] Hey Mike, how's it going? Hello Nate, how are you? Good, good. Hey, I thought we'd do something a little different. I have these two articles that I've been banging around in every single episode script, and I just keep deferring them: you know, "oh, we won't do it this week, I'll do it next week," and then I copy and paste my thing into the next week's script.

And then: not this week, we'll do it next week. But I figured what we could do is just talk about these two articles.

Mike Crispin: Sure. 

Nate McBride: One at a time. And, you know, kind of see what people think about them and get our own thoughts out. The first of the two articles we're going to talk about, one at a time, is "The New Digital Wave of Rationalization," which is subtitled "A Loss of Autonomy."

It was written by Lambèr Royakkers from the Eindhoven University of Technology in the Netherlands, and it's from the January to June 2020 issue of that [00:01:00] journal. The other one is from Adrian Mengay from the Friedrich Schiller Universität in Jena, Germany, and it's called "Digitalization of Work and Heteronomy," also from 2020.

So yeah, I think we can just do the two. I'm going to read a summary of the Royakkers article, tell you what's going on in it, and then you give me your thoughts, I'll give you my thoughts, and we'll see what we get.

We'll see where we end up. I mean, the point of this is that every single episode we're getting deeper and deeper into autonomy for IT leadership and strategic decision making, particularly the loss of autonomy, how to build trust, and all the things that come along with autonomy in IT.

And these two articles, of all the research I did, particularly stand out. We'll find out why momentarily. So, first of all, there's this digital [00:02:00] technologies and human autonomy idea: Royakkers identifies five digital technologies that significantly impact autonomy. Okay, this is a summary of the article.

These are the five digital technologies that impact autonomy significantly. One is IoT. Royakkers stipulates that, with its interconnected devices and vast data collection capabilities, IoT enables extensive monitoring of individuals and environments, raising concerns about privacy and the potential for a chilling effect on behavior.

This constant surveillance can lead to self-censorship, reduced authenticity, and social conformism, as people become more aware of being observed and tracked. And you know, when you and I were talking last season about the loss of autonomy, I was asking you what you would be willing to sacrifice in order to make your life easier.

And IoT, for better or for worse, plays a major role in everyone's lives. [00:03:00] 

Mike Crispin: Absolutely. 

Nate McBride: In terms of how much autonomy we were willing to concede just in general life. And of course that translates over to business. The second point was robotics. Advances in robotics, particularly with the development of AI, have led to increased automation and the potential displacement of human workers, potential being the keyword there.

This is 2020, mind you. This shift towards autonomous systems raises concerns about job security and the dehumanizing effects of outsourcing traditionally human tasks to machines. The author questions whether excessive reliance on robots, especially in areas like caregiving, might diminish meaningful human interaction and erode essential social skills.

Yeah, we'll come back to that one because I have some thoughts on that. Number three was biometrics. Biometric technologies such as facial recognition, fingerprint scanning, and emotion recognition enable the collection and analysis of sensitive personal data, raising concerns about privacy and the potential for discrimination.

The use of biometrics for surveillance and identification purposes raises [00:04:00] questions about the control individuals have over their personal information and the potential for these technologies to be used in ways that infringe on fundamental rights. Yeah. I did not go back and try to find the source article about the recent wave of police arrests using facial recognition technology to capture the wrong people.

Sure. But it's a great recent example of this. Number four was persuasive technology. Persuasive technologies are designed to influence individual thoughts, feelings, and behaviors, often in subtly manipulative ways. We don't have to look farther than the big IT analyst firms to see this happening all the time.

These technologies, ranging from fitness trackers to social media platforms, can promote specific actions and choices, raising concerns about technological paternalism and the erosion of individual autonomy. The author questions whether individuals are fully aware of how these technologies influence their decisions and whether they have [00:05:00] meaningful control over their exposure to such persuasion.

And then lastly, digital platforms. Digital platforms such as social media and e-commerce sites use very extensive algorithms to personalize content, recommendations, and services, leading to filter bubbles and the reinforcement of existing biases. And you can refer back to my conversation earlier.

Many times I've referenced Industry 5.0 by Reng Wang, and how we're getting toward data filtering on a global scale. Every day we move more and more towards that. What Royakkers is bringing forward is this: the author highlights concerns about the lack of transparency in how these algorithms operate, making it difficult to understand how decisions are made and who is accountable for potential harm.

They also express concern about the power these platforms wield in shaping public discourse and influencing individual choices. I'd be fascinated to know if there was a counter to or [00:06:00] continuation of this article in 2025; I think it would be vastly different. And so the bottom line here is that Royakkers argues these five digital technologies have the potential to significantly erode human autonomy, especially when individuals are placed on the loop or out of the loop of the decision-making process.

They emphasize the need for careful consideration of the ethical and societal implications of these technologies and call for proactive measures to safeguard human autonomy in the face of increasing digitalization. Okay, so that's the summary of the article. Go.

Mike Crispin: Yeah. I think any digital technology that we have today measures us, can be easily monitored, and captures data.

So that by itself, I think, has been pretty clear for many, many years: anytime we use a tool that [00:07:00] connects to the internet, we're probably being surveilled or measured in some way. And that's where the feedback loop comes in between the physical and digital worlds.

If you're in the physical world, a lot of people will say, "this is off the record." If you're not being tape-recorded or surveilled somehow, you feel comfortable in person being yourself or sharing information. And when you flip over to the digital world, I think more people, maybe not everyone, are aware that what they're saying can be heard by someone else. And that's the privacy risk.

The loss of autonomy there is that that information gets fed somewhere, probably into a tool that you use every day, which is helping to influence people's decisions. I don't think there's any stopping it. And I think there's an element of this being the path we've all chosen by digging ourselves into [00:08:00] all these technologies, which are often free. Social media is bigger than ever, Google search is bigger than ever, Alexa, or some form of it, is in pretty much everyone's house, and we all have extensions of our brains in our cell phones that know our location.

We've bought in. And I think that's the thing: it's a problem, but who wants to solve it? That's the question I have: who really wants to solve this problem? And is it a problem that can be solved? Or have we gone too far into the nether to actually do anything but raise awareness?

So people can make their own decisions as to when to use the tools, because I still think they have some autonomy to decide when and what to use, and where. I don't think they have all the knowledge, though. There are a number of [00:09:00] businesses, some of them nonprofits, that are trying to build models to protect people's privacy.

But they're just not as good, and they're not cheap. And the article, I think, actually goes into some detail on this, and maybe the next article we're going to talk about does too: you don't get those for free. You've got to pay for privacy, and, as we talked about with paying for data before, everything comes at a price.

So I think the new digital wave of all these things isn't really new, and I know this is from 2020, you said, right? I think it's probably even further along now, where we've got chatbots and, soon, automation that's going to do all these things for us. And whether they know it or not, people are going to use it.

I think even the people who know are like, "ah, this will get me by for another day, so I'll just upload this document and get an answer," [00:10:00] not thinking about where that document goes or if it's being trained on or any of those things. It's just, "I need this thing now, now, now," and they go to AI and they ask, and it's easy to use, it's very personable.

And the robotics piece is going to get more and more available, where I can have robot friends or robot boyfriends or girlfriends, and all this stuff is going to emerge in our lifetimes, I think, Nate. So the question is: did people care enough not to use Gmail, or not to use Google search, or even AOL?

I mean, no, it was cool, it did what I needed it to do. And it's too sweet of a deal for people to worry about the surveillance component, which is the sad part. It doesn't appear to be that important to people, or the three or four biggest and most [00:11:00] wealthy companies in the world wouldn't be making money hand over fist on our information.

And I think we've all either fallen for the trick or there's a lack of real interest in, or concern about, it.

Nate McBride: This will tie nicely into the next Jort that we do, where we cover the heteronomous angle of this. But I agree with you. Let me be careful about how I say this: it's not that there's no way to stop it from getting worse. Again, the author, Royakkers, emphasizes the need for careful consideration of the ethical and societal implications of these technologies and calls for proactive measures to safeguard autonomy.

Well, human beings have just become so [00:12:00] willing to give up autonomy. Yes. For what they consider to be a temporary, or potentially even a short-term, improvement to their life. You know, it's a quick payoff. Dopamine. Dopamine.

Mike Crispin: I mean, people love to post online and get a few likes or a few agreements.

It feels good. It's human, you know? Someone pats you on the back or agrees with your point of view, and it feels good. And it's addictive. For all its downfalls and risks, it was genius, because it's still addictive today.

It's bigger than ever. 

Nate McBride: So when we translate this over into the corporate world, I mean, you and I, as IT leaders, we're thinking about not only how we're affecting our employees the same way, and how we're applying [00:13:00] proactive measures to keep usage at what seems to be the appropriate or most effective level; you and I ourselves also have to be careful about how we're influenced, about our own autonomy, and about how our decision making is impacted.

And that's something I have to think about a lot more these days than I ever did before. Again, we've talked about this, but when I'm making a decision: am I making this decision on my own? Or have I been influenced in making it? And if I have, how? And was it so insidious I never saw it coming?

Or am I truly making this decision based on objective and empirical facts? Something to think about. 

Mike Crispin: It's difficult. If you don't think about it, it can just happen kind of automatically, and you don't realize you've been influenced by something or thought it was an easier path. And you don't really think about the kernel, the seed, where that all came from.

You end [00:14:00] up, maybe a few months later, almost not realizing how you got where you are. 

Nate McBride: Well, I appreciate your opinion on this. We're going to do another Jort about a different article, so if you're interested in what Mike and I were talking about just now, tune into the other Jort so you can hear the other side of the story.

But thanks for tuning in. Thanks, Mike, for your opinion on this one. 

Mike Crispin: Thank you. This is good.

