Will AI Save Or Enslave Us?

Is AI to be our Saviour or Enslaver?

Every great story is a battle of good vs evil. The question of what role AI will play in our future is yet to be answered. What is critical is who gets to teach AI.

Will it exaggerate the flaws within our own systems or will it bring more humanity?

I gathered together a group of people brighter than me to think this through.

Giannis Chatzidis – Chief Data Scientist and AI Coach

Muhammad Mehmood – Ex-CEO of a tech company

Neil Harrison – Adaptologist and Change Professional

Paula Anastasiade – Change Management Consultant

Listen in to hear the potential pitfalls and benefits AI could bring.



Rob: [00:00:00] AI has enormous potential, but it seems too early for it to be fully usable yet.

Rob: My interest in this started when Neil and I were talking about what’s going to happen with AI. It made me realize that the distinction humans have is emotions, and it’s emotions that make decisions. So emotional intelligence looks like it’s going to become more and more important, because that’s the element that humans can provide.

Rob: Neil is an adaptologist. He’s worked through change management and feels that organizations need to be more ready for change. 

Rob: I’m going around my screen now. Giannis is a coach who is using AI, merging tech and human coaching, to help people gain more self-awareness and become better leaders.

Rob: Am I right there, Giannis? 

Rob: Muhammad has more degrees than I have GCSEs. And among his nine or so degrees, he [00:01:00] led a tech company while coming from an entirely different field; not content with just leading, he decided he was going to learn how to do it and took another degree in tech. Muhammad seems to be across everything. I thought he would be someone who could add something on the tech side, but also on implementing the human side. Initially I was thinking of people involved with tech, and then I realized: hang on, we’ve got to adapt the human side to this.

Rob: Paula is a change manager. I’m not sure how technologically advanced you are, Paula, but I know that you’re very in tune with organizations and the way they need to change in order for change to be implemented. That’s really the background, so everyone has an understanding of what everyone else is contributing.

Rob: Neil, I think you have a strong opinion on this, so I don’t know if you want to start off.

Neil: I’m naturally optimistic about this, and in what I’m about to say I recognize there’s always a tension, isn’t there, in all [00:02:00] good stories.

Neil: You’ve got light and dark, good versus evil and all that, and all of that is at play. But at an optimistic level, I think AI is actually going to help remove a lot of those jobs that we humans just don’t like doing, frankly: the mundane tasks we tend to face day in, day out. Through AI technologies, by removing that mundane activity for the human at work, we can create a space that allows humans to really excel at the things that are uniquely human.

Neil: AI isn’t going to make decisions for you; it’s going to help you make decisions. It might help you with ethical decisions, but it’s not going to make ethical decisions for you. And so if we can use AI in a way that frees up our time for those more human, cognitively challenging traits, then not only do we remove the [00:03:00] mundane, we free ourselves to work on innovation and creativity.

Neil: Most people, I think, are going to enjoy work more. We’ll find ourselves enjoying work much more because of the creative, innovative and challenging work that we’re doing, which is rewarding. So in a nutshell: remove the mundane and free people to be genuinely human at work. And I’ll just quickly pick up on the one point you made in the introduction: in doing that, and in building innovation and creativity, what we need to do is recognize our emotions at play.

Neil: So we are cognizant of how emotions play into good team dynamics. We’re inclusive, we embrace diversity, we consider other people’s emotions in the room, those sorts of things. We give everyone a voice in a way that allows for that innovation and creativity.

Neil: And leaders, I think, need to forget [00:04:00] this concept of having all the answers and instead push decision making down. Again, very rewarding, but it also allows for speed in decision making. And so leadership roles, I think, will evolve over time. It might be harder for leadership to make these kinds of changes, but I see a future where, with AI removing that mundane work, work actually becomes a much more enjoyable place.

Muhammad: If I can add to what Neil is saying: I’m in alignment with what you’re talking about, Neil. I have a few additional points, which slightly deviate from the end of your conclusion. One is that even AI, or call it artificial intelligence, it’s us as humans who actually built it in the first instance, right?

Muhammad: So there is a human behind ChatGPT or whatever you call it. Now, I’ve seen both sides of it. From the tech perspective, AI has really revolutionized certain industries, and I’m going to talk about my very own industry, which is [00:05:00] hospitality. The first time I came across a use of it was about five years ago, when we realized that by using these clever algorithms we could optimize a delivery route for the driver to reach the destination earlier, to find the quickest route.

Muhammad: So we can connect with Google, we can connect with other street maps; it’s an integration through AI, which is a huge plus. Then there is another element which I’m part of, or was part of, which is AI voice ordering. Before that, you’d pick up the phone to your local pizzeria, say Domino’s or Papa John’s, talk to a human, tell them what you want, and they would sort you out.
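A delivery-route optimizer like the one Muhammad describes is, at its core, a shortest-path search over a road network. Here is a minimal sketch using Dijkstra’s algorithm; the road network, place names and travel times below are invented for illustration, not taken from any real delivery system:

```python
import heapq

def quickest_route(graph, start, goal):
    """Dijkstra's algorithm: return (total_minutes, path) for the
    quickest route through a weighted road network, or None."""
    # Priority queue of (travel_time_so_far, node, path_taken)
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        minutes, node, path = heapq.heappop(queue)
        if node == goal:
            return minutes, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, cost in graph.get(node, []):
            if neighbour not in seen:
                heapq.heappush(queue, (minutes + cost, neighbour, path + [neighbour]))
    return None

# Hypothetical road network: each edge is (neighbour, minutes of travel).
roads = {
    "kitchen": [("high_st", 4), ("ring_rd", 2)],
    "ring_rd": [("high_st", 1), ("customer", 9)],
    "high_st": [("customer", 7)],
}
print(quickest_route(roads, "kitchen", "customer"))
# (10, ['kitchen', 'ring_rd', 'high_st', 'customer'])
```

Real systems layer live traffic data and map APIs on top, but the underlying idea is the same: model the streets as a weighted graph and search it.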

Muhammad: Now, with this AI voice ordering we are removing the human element, because it’s a robot entertaining you. Even if you say, “How are you?”, it’s just going to respond in a robotic voice: “I’m fine, and how was your day?” Again, that’s been fed in by someone; there is data behind it. So whatever AI tells you, even when you ask ChatGPT a question, there is a [00:06:00] data source behind it. I’m sure all of us have used ChatGPT for various reasons.

Muhammad: But if you ask ChatGPT about something, there’s a cursor which blinks and then starts typing. The reason for this is that it’s a visual reflection that, as a user, you are expecting ChatGPT to answer you back. Visually, while the cursor is moving slowly, you’re getting some answers, but what it’s actually doing is behind the scenes.

Muhammad: It’s going through a cache of data sources, finding the commonalities, forming a sentence and sending the information back. So AI has huge benefits. But I’m a strong promoter of empathy, compassion, human interaction and human connection, and I believe that the further we get into this, the further we get away from human connection.

Muhammad: We are detaching ourselves. We are detaching our emotions. And especially the most interesting part, which is still [00:07:00] in its earliest days, is the use of AI within the medical field. I was reading an article the other day, not about robots doing surgery, where they’re already making huge inroads, but about differential diagnosis, where you have a portal, you log in as a patient, or customer as we call it, and you start adding what symptoms you’ve got.

Muhammad: There’s no doctor behind the scenes. It’s just an AI engine evaluating your symptoms against a certain pattern or database; it gives you, or spits out as we say, information based on differential diagnosis, and then it suggests the next course of action.
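The symptom-matching Muhammad outlines can be pictured as scoring a patient’s reported symptoms against a database of known patterns. A toy sketch, with an invented (and medically meaningless) condition table:

```python
def differential_diagnosis(symptoms, conditions):
    """Rank candidate conditions by how much of each condition's
    known symptom pattern the patient's report matches."""
    reported = set(symptoms)
    scores = []
    for name, pattern in conditions.items():
        overlap = reported & pattern
        if overlap:
            # Score = fraction of the condition's pattern that matched.
            scores.append((len(overlap) / len(pattern), name))
    return [name for _, name in sorted(scores, reverse=True)]

# Invented pattern database, purely for illustration.
conditions = {
    "common_cold": {"runny_nose", "sneezing", "sore_throat"},
    "flu": {"fever", "aches", "fatigue", "sore_throat"},
    "hay_fever": {"runny_nose", "sneezing", "itchy_eyes"},
}
print(differential_diagnosis(["sneezing", "runny_nose", "sore_throat"], conditions))
# ['common_cold', 'hay_fever', 'flu']
```

A real engine would weigh symptom specificity, prevalence and patient history; the point is simply that, as Muhammad says, there is a data source and a matching rule behind the answer, not a doctor.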

Muhammad: Now, at this moment in time, it can’t prescribe you, let’s say, antibiotics; in the UK you still have to have a human interaction there.

Muhammad: But I think the trend is heading there, and even as a medical person I agree, and I’m happy that we are moving in that direction. What we are removing, [00:08:00] though, is the human contact. For instance, during COVID, people were really hoping to talk to someone. As a volunteer I’ve seen that: when I used to pick up patients from different locations, they were really dying to have a conversation with a human.

Muhammad: So again, I think that’s probably the negative side of things. But yeah, that’s my take: we are going further and further away from human connections.

Paula: What Muhammad just said made me remember something I saw the other day. There was this one company that advertised its services, or rather its differentiator, as: “We don’t put you in contact with robots. We actually have humans answering your calls.” And I thought it was so interesting, because we still think that bringing in AI is a key differentiator.

Paula: In fact, the way things are going shows us that the opposite is true: it’s the human aspect that is becoming more and more of a key [00:09:00] differentiator, which is so interesting and in a way paradoxical. I’m also with both of you, Neil and Muhammad. There are certain things you mentioned which I took notes of, because they’re very interesting to look at, for example the ethical aspect of it all.

Paula: We could probably do an entire session just on that, but thinking about innovation and creativity, it’s true. And I really liked that expression, removing the mundane. It’s true that innovation and creativity have more space to grow when we use AI wisely. 

Paula: The question is, does everybody’s job involve innovation and creativity?

Paula: Because it can be argued that some people, yes, need to be innovative and creative, because that’s what their role is about. For others, that may not be the case. And then speed in decision making: I think that’s a really interesting point, because at least in the change management world (and Neil and I [00:10:00] have exchanged on this topic),

Paula: the quality and the type of decision making around various kinds of organizational change often leave no room for people’s involvement.

Paula: In other words, it doesn’t really enable people to co-create. So much so that a lot of organizational changes, at least in my experience, most of them, are made in a top-down manner.

Paula: They often leave people thinking that they’ve been made quite hastily. So I’m wondering, do we want more speed in decision making or do we want more quality in decision making? There are a lot of questions that we can raise when it comes to the quality of decision making around organizational decisions and organizational changes in particular.

Paula: But if speed comes at the cost of quality, which is already questionable in many instances, I don’t know if I’m as optimistic. I wish I was, but I don’t know. And yes, like Muhammad said, a lot of people are just dying to find a human [00:11:00] at the other end of the line. Just think about the many times we call our banks with the usual questions, and we find ourselves having to repeat the same thing three times over in the simplest of terms, and the message just doesn’t get through. Sometimes you actually get disconnected altogether because you haven’t phrased your query in language, or in a way, that the algorithm can understand.

Paula: So there’s a lot of that. One last thought, also on Muhammad’s great point about how humans are being removed from more and more things: in my experience with projects, with organizational changes that aimed precisely to have certain humans replaced by, let’s call them generically, machines, because that’s the general perception,

Paula: there is one pressing concern. I would say the number one concern of the stakeholders I’ve been working with and for, and that [00:12:00] is: our jobs are going to get cut. Sometimes that is true and sometimes it’s not. But the fact that the leadership team does not have a clear vision for how these people’s jobs will evolve, for precisely what Neil was saying, the extent to which their jobs are about to become more innovative and more creative, leaves a lot of room for fear.

Paula: And so a lot of resistance builds up as a result of this very legitimate and important concern being left unaddressed.

Neil: I’ve got quite a lot to say about that, but I’m conscious that you might want to...

Giannis: Yeah, I think that gives me a great opening, because I’ve been writing code for the last 15 years, and in the last year there’s been a great tool out there that can write code faster than me, with better structure than mine.

Giannis: I’m a lead data scientist, and there are always two sides of the coin. The first is that I write code faster, so I have a really good helper and supporter to [00:13:00] get my job done. But on the other hand, where is my creativity? Where is the enhancement of my emotional intelligence, to find the correct words, to think out of the box, to improve my problem-solving skills?

Giannis: I read recently an article about a top tech company that stopped all of its employees’ access to ChatGPT for a month, and productivity dropped by 65 percent. And I’m really wondering: is it because those people cannot live without GPT, or that GPT is now part of our lives, or that GPT has already replaced some of our human skills, like creativity, problem solving, efficiency, speed, our way of thinking?

Giannis: And from my side, I really want to say that I’m not afraid of AI. As Neil said before, it’s a great tool [00:14:00] for automating things and making them faster. That’s super amazing, in medicine too; it can be applied in every sector around the world. But I’m really afraid of how people are going to use AI.

Giannis: And I think this is an open question: are we really afraid that the robots are going to replace human beings?

Giannis: Or that we are getting closer to robots in our behavior? Meaning, as Muhammad said before, there’s a super common question in the workplace: how are you?

Giannis: And if you ask “How are you?”, the reply is going to be, “Great, thanks. How about you?”

Giannis: Do we see ourselves in this reply? In the workplace nowadays, are we becoming more like robots in our relationships? “How are you?” “I’m great.” That’s it, and the discussion is over. “Thanks for asking. How are you?” “Not bad.”

Giannis: Okay. Where are the emotions? 

Neil: I think there’s quite a lot in that. Let me see if I can unpick some of the key points in it. [00:15:00] I think we live predominantly in an environment, as Rob and I have talked about, of industrial-age ways of working.

Neil: For the most part, we work with organizations where leadership is expected to make the decisions, middle managers are seeking out efficiencies and productivity gains, and front-line workers work their socks off. That is the context in which we are introducing technologies as they stand today.

Neil: So if we look at digital transformation, we have consultancy firms and IT providers that implement digital transformation, usually born out of a single product. And the driving factor tends to be cost optimization and greater efficiency or effectiveness. That driver isn’t thinking about what we can then do with the time we’re saving.

Neil: So quite often what digital transformation means is that we can save headcount and reduce cost. [00:16:00] Now that’s fair enough; it makes good commercial sense. Where AI makes a difference is that we are seeing a pace of change in AI that I think we haven’t seen for quite a while.

Neil: And in organizations that are hierarchical, with strategic decisions and big technology programs taking 18 months, you’re investing millions of pounds in a solution that might not be as relevant 18 months from now. So I think we actually need to think about this in the context of the pace of technology development.

Neil: And you think about also actually customer demands and the much more tailored and personalized expectations of customers, I think requires a different thinking. So standard solutions in digital transformation programs probably aren’t going to cut it if you’re not getting that until 18 months from now.

Neil: So in order to make the most of AI and develop some of that technology, I think we need to think [00:17:00] differently about the role of leaders, differently about the cultures and the behaviours in which we operate in organisations, and definitely about the timescales with which we operate to improve productivity and efficiency. But there’s another thing: to me, it seems like AI starts to introduce concepts we’ve not even thought of before.

Neil: So new products and services, new USPs for organisations, even changing the market. I was thinking about my mobile phone the other day. I can remember when mobile phones came out (I’ve got a filter on, so I look a lot younger than I am), and I thought: why would I ever want a mobile phone?

Neil: If I’m at the house, why would I want to be contacted? And think about how we use mobile phones now; I pretty much only make calls online. So I think we couldn’t have dreamt of some of this stuff 20, 30 years ago. I think AI has the potential to reinvent what it means to be at work.

Neil: But of course, that’s not going to happen with today’s thinking about how we operate.

Rob: It seems to me, as I’m listening, [00:18:00] that this is a qualitative change. What really drove the industrial revolution was specialization, and specialization hasn’t really happened in knowledge work. We’ve taken the structure of industrialization, but we haven’t really specialized and broken the work up.

Rob: So people who are creative, like advertisers or analysts, still do much the same as they did 50 years ago. If you’re a writer, say, I don’t think we’ve really broken down the elements of writing, the elements of the different tasks.

Rob: And I think that was the key breakthrough of the industrial revolution. Computers haven’t really made us any more productive; machinery made us 50 times more productive, and computers haven’t had that impact. And it’s really because we haven’t done what Ford and Taylor and all of those people did, who said: okay, let’s look at how it’s [00:19:00] done.

Rob: Let’s break it up. Let’s look at how we can make it much more efficient.

Rob: I don’t think we’ve ever done that in clerical work, and I don’t think we’ve ever done it in creative work. But that is the thing: when you say that AI is going to take away some of the menial tasks, it means people can really focus. In whatever work you do,

Rob: there’s stuff that you really enjoy, where you really get into flow.

Rob: And then there are the drudgery tasks you have to do to be entitled to do the work you love. So I can see where AI has that impact. What we need to be doing is looking at organizational structure, and I don’t think it’s just a change.

Rob: I think it’s a revolutionary change. I also think it’s a change in what it means to be human. When we were nomadic, there was a distinct change to becoming agricultural, a complete change in the way of life. And some people chose not to be like that; there were still tribes that stayed nomadic.

Rob: When we were agricultural, there was a change to becoming industrial. And some people [00:20:00] opted out of that; some people stayed in the country.

Rob: Now, I think there’s a change from industrial to digital. And what that means is that we have to change what it means to be human. Not just in person: how do we make connections and find our tribe digitally?

Rob: How do we interact? How do we form connections?

Rob: So I think there is a dramatic change in how we have to look at technology. Because what I can see is that there is so much fear and so much frustration. I can remember a couple of instances where I’ve been stuck in some AI help desk and I just need to speak to someone, because my issue doesn’t fit into any of those boxes and there’s no one to speak to.

Rob: And you just roll around from one loop to another, and it becomes so frustrating. For people who aren’t technologically adept or literate, whose work doesn’t involve that, I can really see that there’s the fear of losing jobs. And I see very much an [00:21:00] analogy with the Luddites, where people were wrecking the machines.

Rob: And when I see films like The Matrix, 1984, Divergent, Demolition, you can see that movies tend to play out the fears people have. Way back it was the fairies and the trolls in the forest, then it was aliens, then zombies, and now the tendency is this kind of surveillance, and AI becoming the enemy.

Rob: So I think organizations are really going to need to manage that change. But also, whenever there’s a change: we’ve had the capability for online shopping for 20-odd years, and it took COVID for remote work and online shopping to really take off. So there often needs to be an event.

Rob: Even though the possibilities of AI are there, it often takes a trigger for people to take it up and accept it. There’s lots there, but it’s about how we [00:22:00] navigate this new world and become digital, and how we avoid leaving other people out.

Rob: And deal with the fear of what it means. I can remember, ever since I was about the age to enter the workforce, fears that there weren’t going to be jobs for everyone: manufacturing is dying and there will be no jobs for anyone.

Neil: My sense, in the short term at least, is that there will always be jobs that can be replaced with technology; that’s largely what’s driven digital transformation programs for quite a while. But I think the event that will make a difference, in the main, is for knowledge workers.

Neil: It won’t be AI that replaces them; it will be people who are interested in exploring what AI can do who will replace them. When you get that augmented human-AI collaboration working, you actually get the best of what both can bring. I’m interested in Muhammad’s view on this, because from the things I’ve read [00:23:00] (and it’s not a great deal) around how AI is implemented in the health sector...

Neil: I’ve seen some great stories about how AI is able to sift through lots and lots of data and spot particular causes or symptoms of problems that wouldn’t otherwise have been picked up by very busy doctors and nurses. So there’s an ability to use the capability in a way that frees people’s time to then have that more personal touch.

Neil: The personal touch: I’ve been to hospitals a few times myself, so I’ve seen how busy they are. If we can remove a lot of the busyness and improve efficiencies, don’t we create more time for the human interaction and the things that really matter, as well as improved diagnosis and so on?

Paula: Perhaps this sounds a bit conservative, maybe even old-fashioned, but I think that to make good use of any tool, including AI if you want to look at it like that, one has to have a solid [00:24:00] intellectual foundation, I would say. Which is why we need people with good critical thinking and solid decision making in key positions, of course.

Paula: Because, as Muhammad said, there’s always a human behind these tools, right? That’s why we keep hearing and reading about how AI works on the basis of garbage in, garbage out. And for us to do our jobs better using AI, first of all we need to be better at certain things ourselves, because AI will not fill those gaps for us.

Paula: AI will not come in and make those decisions around ethics that Neil very rightfully called out at the beginning of our conversation. AI will not fill in those critical thinking gaps, at least I don’t think so. Not to mention the emotional intelligence aspect of all that. So yes, I also believe that a human without AI can be replaced by a human with AI, provided that the human with AI uses it in [00:25:00] an intelligent and a wise way. And I’m saying that because, as a change manager, I have to think about AI from two perspectives. One perspective, of course, has to do with the people who are going to embrace “AI”, their perceptions and their potential resistance.

Paula: In other words, that’s the perspective of the people that I’m serving as a change manager. But the other perspective that I need to be mindful of is how AI gets used, or is likely to get used more and more, in the change management world by change management practitioners. And change management is still full of misconceptions.

Paula: We still talk excessively about models and processes: a number of steps to take, a fixed recipe to follow, and boom, you have the change. It doesn’t work like that. We are still fixated on these things instead of talking about more useful aspects of change management, such [00:26:00] as behavioral science, neuroscience and a lot of other things more useful than a fixed recipe.

Paula: So my concern is that we superimpose AI on this foundation of misconceptions, incorrect assumptions and incorrect approaches to change management. And what do we end up with? I’ve been looking at ChatGPT and another type of GPT that was specifically developed for change management.

Paula: And I really wanted to understand, and I’m still interested in this aspect, how does AI really amplify change management? 

Paula: I’m again left with the conclusion that if you put in garbage, that’s what you get out of it. Of course, AI is very helpful on certain levels. But if you lack the foundation, in other words, if you are not well developed to the point of being able to tell the difference between something that makes sense and something that doesn’t, you are just going to be tempted to [00:27:00] take whatever ChatGPT, or whatever change management GPT, gives you.

Paula: You are going to take it for granted. And there’s a lot of, I’d say, inappropriate, ineffective content that ChatGPT, or any GPT for that matter, can provide you with. That’s why I fully resonate with what Neil said: a human with AI is likely to be more effective and less easily replaced than a human without AI, provided that the human with AI is able to use it in an intelligent, competent and wise way, not as a tool for cutting corners.

Giannis: Also, I wonder: has AI really replaced busyness?

Giannis: In reality, okay, it has probably replaced a lot of tasks and added a lot of automation, so we can save a lot of time. But I think it has increased the feeling of depression in people, and in leaders as well, and has increased the feeling of “I’m not enough”. [00:28:00] I’m not good enough to lead the team; I’m not good enough to find a solution, because everyone can ask ChatGPT for a fast solution, probably better than mine. So what kind of role am I playing here as a leader?

Giannis: And just to continue Neil’s thread from before: 20 years ago we didn’t have mobile phones, and that was like an industrial revolution. Then 15 years ago GPS arrived, and that was like a second industrial revolution, because nowadays I cannot go anywhere without GPS, even in my own city.

Giannis: Twenty years ago we had maps, and we’d say, okay, I’m approximately here or there. Now the first move before going out of the house is to take our phone, and the first move on entering our car is to enable the GPS. And now the first move [00:29:00] when we have a problem is to ask GPT. Probably in 15 years the first move on going out of the house will be to take our robot pet dog with us. That’s the future.

Giannis: Yeah, but in general, I think it’s super useful to keep in mind how our lives were before GPS, before ChatGPT and all of these algorithms in general, and to ask whether AI has really replaced busyness and we have increased our free time with loved ones, with our family, with everything that’s really beneficial to us, or whether in reality we feel more alone than ever, or more “not enough” than ever, especially in the leadership field.

Paula: Yeah, that takes me back to Neil’s point about, okay, we are saving time. What do we do with that time? 

Paula: Do we spend it with our loved ones, do we engage in more meaningful [00:30:00] activities, or do we just say, “Oh, we’ve saved time,” for the sake of saving time?

Paula: That’s a great point, Giannis.

Giannis: Yeah, and again, it’s the same thing with LinkedIn, which Rob mentioned at the beginning, with AI commenting: do you leave a comment just for the sake of commenting, or because you’ve read something insightful and meaningful to you and you really want to leave your opinion below that content?

Paula: My theory (though of course I don’t have any proof, which is why I just call it a theory, and even that is a bit too much) is that people who use AI to post comments on other people’s posts do so for a very simple reason, which comes to us from behavioral science and has to do with reciprocity.

Paula: In other words, we comment because we know that person, if they work by the normal rules that humans work by, will reciprocate. But then, of course, there’s the quality of the content, the quality of the input they [00:31:00] provide, that we can comment on.

Neil: It’s a really interesting example of where we see a distinction between what we can do with digital tools and AI, and where we need the human interaction.

Neil: So if I read a post that looks to me like it’s been copied and pasted from GPT, and somebody just hit go, my trust in that individual actually drops.

Neil: That’s not a value-add activity. And with comments it’s even worse, frankly. But if we think about the outcomes we’re trying to achieve: everyone on this call is seeking to gain insights, relationships, thought-provoking engagement, learning and sharing.

Neil: And you can spot that through the engagement. People posting GPT-written comments are not adding those things, and you can spot that too. So what outcome are they achieving? It’s certainly not the human dimension [00:32:00] of interaction. For me, it actually reduces my trust in those people, which is counterproductive to my mind.

Neil: But I think there’s a macro point here, which is: what is it we’re trying to achieve? I love this idea of being less busy. If we’re constantly driven to use AI in a way that seeks to remove headcount from organizations, those are probably not the organizations that are going to survive in the long run, to my mind, because digital technologies have been around a long time.

Neil: What I think we need now are people that can imagine a future, a different future, to create that revolution in how we operate. To my mind, that’s got to come from humans.

Paula: That’s why I’m really concerned that digital transformation programs designed for executive leaders approach the topic of AI almost exclusively through this lens of how to outcompete others using AI. And so it’s not about what you said, [00:33:00] unfortunately.

Paula: How do we imagine a different future? How do we leverage technology to imagine a different future? 

Paula: Actually, that’s what they think they are doing. They are imagining a different future for their business using AI, but not necessarily for the people who are making that business successful. And that’s what I find really troubling.

Neil: I think that’s what will make the people leave. 

Muhammad: I just want to add to what Neil said and what Paula was just saying. My personal view is that AI is obviously a technology, and we have become slaves of technology anyway; a good example being the iPhone, or the smartphone, right?

Muhammad: And this is my take on AI: it has a positive element and a not-so-positive one. On the positive side, Neil, you mentioned an example where somebody got diagnosed better because the GPs have more time, and it’s data driven, obviously.

Muhammad: That connects to what Paula mentioned about speed versus quality. So you have a good example there; it’s one example of quality decisions, right?

Muhammad: Having said that, [00:34:00] what is implicitly happening now within the NHS is that the NHS is aware this tool can be utilized. Obviously our NHS is in crisis, which I’m sure we can do a separate podcast on.

Muhammad: But the number of GPs within the surgeries is actually being reduced further because of AI coming in, which is a quite alarming report that came out recently. On the other side, I would say that we as humans are equally guilty of bringing AI into the equation. Why? Because when there is a need, a demand builds up, and then somebody has to supply that demand.

Muhammad: And I’m going to link it to the likes of the big players like Amazon, for instance, right? 

Muhammad: I’ve got one of their dispatching units about 10 miles from where I live, and it’s all AI driven, it’s all robotics, and obviously there’s a headcount reduction. Paula, you mentioned the business take. I’ve got a good example of a business I know of that used to have a door production facility with 30-odd people; now [00:35:00] they have robotics and only one person who supervises, while the rest were either made redundant or shifted elsewhere. I also know of companies using AI to free up time, which I think is what Giannis mentioned: they deploy it in their customer service team to enhance the customer experience, so they have more bodies to actually physically call up the customers. So there are good and not-so-good elements to it.

Muhammad: When you’re hiring people, and this is my personal opinion, you can teach anyone anything about a product or service, but you can’t teach attitude. So I personally look for attitude in a person. And as humans, we will never be able to teach AI attitude and emotional intelligence.

Muhammad: So that’s my bit of resistance, and to the point of change management: in big companies, I see more and more leaders who are not trained enough to accommodate or take everyone on board. Why? Because they don’t know what the heck is going to happen in 10 years’ time. So what is happening in fact is, [00:36:00] yes, the workload is decreasing, but so is the morale of the team; people tend to leave the businesses, or they’re disengaged.

Muhammad: So it’s another issue, toxicity and all these shenanigans start building up, right? 

Muhammad: Politely, I will disagree: AI is not actually freeing up our time. It’s creating more issues, which at this moment in time we don’t understand. Say I am a business owner with 50 employees.

Muhammad: I have invested money and brought in robots and everything, and I believe my process will be efficient. But the biggest asset I’m losing is my people, and I’m not looking at that element. I could be utilizing their time, the free time they might have, and allocating it to a different vertical of the business.

Muhammad: In essence, or in short, what I’m saying is that it’s human greed that has driven everything, and now we are debating the pros and cons of it. I’m from that school of thought that whenever we want to push [00:37:00] something through, we create an environment or a space for it.

Muhammad: Because at the end of the day, you, me, everyone on this call is a commodity, a number for big business, and they’re just collecting this data. And Giannis made a very good point about GPS. Yes, exactly. Have we lost our freedom to a certain extent? Yes. Have we lost our privacy?

Muhammad: Yes. Anyone employed at Google knows exactly when I’m leaving home and when I’m reaching a destination, right? So we’ve lost that control, which we had with the A to Z maps. I used to love opening up the map on the bonnet and figuring out, okay, I’m stranded somewhere in Sicily, now how would I reach Palermo? But I managed to drive in those times as well. I used to reach Palermo on time, I would say.

Neil: I understand what you’re saying. But I think these instances are driven by old thinking. And I actually think AI creates an opportunity to be more human, because those unique traits are the things that are going to make a difference.

Neil: So you and I, [00:38:00] all of us, are customers of service centers. I absolutely know which I prefer when a robot at the other end doesn’t understand me saying yes or no. Now, the driver for that, of course, goes back to what outcomes you want. The driver is that we’re trying to reduce costs by putting chatbots at the front end.

Neil: But imagine other companies that were freeing their time through other things, automated note-taking in meetings, things like that, so they’ve got more time. They can have more time for human interaction in their service center. Which of those companies are you going to use? The one with an individual at the end who demonstrates empathy and passion and cares. In fact, it happened to me just the other day; I found a bit of plastic in my bun from the shop.

Neil: I’d rather talk to another human being about that, because I’m concerned about the production line, rather than trying to convince a robot that there is an issue. I think it’s the [00:39:00] companies that start to spot those opportunities. It really is that more human-centric approach, isn’t it?

Neil: How can we add value through being more human for our customers and clients?

Rob: On that point, often when I’ve had a problem, the way customer services deal with it, particularly when it’s the NHS or a large organization, is that everything is driven by rules.

Rob: And you can often talk to someone and they’re inhuman. It’s a human being, but they’re inhuman. I’ve had to take my dad to the hospital, and there’s literally no one to complain to, no one to talk to. Sometimes humans are constrained by the rules of a big organization.

Rob: So sometimes that’s as bad. 

Neil: Yeah, those rules. In service centers, those rules are put in place because they’re trying to reduce the complexity at the entry level, if you like, for [00:40:00] the employee. If you make it simple and you’ve got a set of rules to follow, that makes it simple.

Neil: Actually, in the simple cases, you could develop AI. And I think there’s a separate thing here about how you train AI to be more human, even at those very basic levels of chatbots. But imagine AI develops in a way that allows you to take care of those straightforward cases: I bought this product at the shop, like the one I was talking about, and it’s out of date.

Neil: We’ll take it back to the shop, okay. In more complex cases, you need humans. And if we can again free time so that the simple cases are dealt with, then for those more complex ones, where you need to understand context and human emotion, I think it frees time to allow a deeper and richer exchange with other people.

Rob: But I think that again, that is dependent on a certain skill level and autonomy in their decision making.

Paula: I think it [00:41:00] also matters where the question is coming from, because I completely agree with Muhammad’s point that it’s greed that is driving everything. And I can’t help asking myself about this little group of ours, and I’m sure there are other people as well, asking or considering the opportunities to become more human and humane.

Paula: Is it any good if those who are actually in a position to make decisions don’t ask these questions, if they only focus on speed, saving time, saving money, reducing headcount, and all of that? I feel there’s a lack of balance there. As always, what we’re taught in sociology is that who asks the question, or who sets the agenda, changes everything.

Paula: It really does matter. 

Giannis: This is exactly what we discussed. I think it applies to coaching with AI, for example, because recently a lot of companies have [00:42:00] created coaching chatbots to run small coaching sessions for people, et cetera. Especially here in Greece, for example, there’s a lot of hesitance about coaching; it’s not like a trend or something.

Giannis: People are not so familiar with coaching like in the UK or the US, for example. And again, I think those chatbots are good only in the sense that they give people access to coaching where it’s not accessible, for example in really poor countries in Africa, in Latin America, in Asia, et cetera.

Giannis: So those chatbots are probably super interesting for those people to get a first glimpse of what coaching is and what it looks like, for example. But I think they will never replace the empathy, the compassion, the trust and the safe environment that a coach, or a therapist, let’s say, can create for the client, et [00:43:00] cetera.

Giannis: It’s just a first view of how a coach is probably going to help you. And again, with leadership, because this is what we talked about before, there are a lot of leaders in a leadership position without being trained enough, without being skilled enough.

Giannis: And skilled means not experience, not age, not gender, nothing of all that. It’s mostly about human skills, all the skills that every leader should have. And this is where there’s a super big battle between the AI aspect of leadership and the human aspect of the leader as well.

Giannis: And there’s a big battle inside them about where to take the team. To the results? Because business is result driven; this will never change. Of course, it’s money driven; every business is about money. And the leader thinks, okay, where should I go? Should I focus on results, which everyone wants, or should I focus on my people, on [00:44:00] emotions?

Rob: I think that’s really interesting, those two points. What comes to mind is Marx at the beginning of the Industrial Revolution. When Marx talked about anomie and alienation, we were separating people from the means of production. We were taking away meaning, we were taking away purpose.

Rob: And Marxism has been taken as dogma and it’s been used, obviously, as communism. 

Rob: Actually, Marx had a lot of interesting points, and he was right about some of the elements, and it connects to what Giannis and Paula have talked about.

Rob: A lot of the fearmongering about AI, that AI is going to control us, that AI is going to want our bodies for its energy and all this kind of thing, is us ascribing human emotions to AI, imagining AI is going to be power hungry and greedy. But actually AI just does what it’s fed to do. It’s a tool. It [00:45:00] doesn’t have a goal of its own.

Rob: So it has to be used for a goal.

Rob: When we’re looking at companies, the real problem that we have with companies is pure greed. And it is because the whole premise of a business is to make money. 

Rob: In the industrial age, this was fine, because the industrial age moved us from a time when people had no money; literally, you were one crop failure from starvation and dying.

Rob: Or being put in a poorhouse. Survival mattered because we didn’t have any resources, apart from the chosen few. But what the industrial age did came at the cost of our freedom, and to some extent our happiness and health; we sold those out for wealth and security.

Rob: But what’s happened now, in the last two to three hundred years, is that we mostly have a welfare system.

Rob: When you look at the experiments with universal income, we don’t need to work; we can work for pleasure. But that idea of [00:46:00] greed, of survival, has become something we’re driven by, and it’s creating burnout, it’s creating the mental health problems that we have.

Rob: It’s creating disengagement and it’s creating emotional problems. So I think the real question is, it really is a revolution in terms of, we need a change in government. Not a move from one political party to another, but the idea of democracy is basically that you have caricature people that you vote for.

Rob: And it’s simplistic ideas that aren’t really solving problems. When you look at the NHS, when you look at care, when you look at policing, they’re chronically underfunded, in the UK at least. They’re really on the verge of collapse. One incident could probably make many of them crumble, because we live in this facade that some political manipulator can make magic out of nothing.

Rob: We supposedly don’t have to pay any more taxes, and yet we can have this dream [00:47:00] service. And it’s not true. We need to recognize that if we want more service, we need to pay more. But no politician is going to say, okay, we’re going to put up your taxes.

Rob: We’re going to give you a fairly decent service. And so it’s all built on lies, until we as citizens see through those lies and are willing to really confront the issues. We’ve got an aging population. We can’t afford to pay pensions for everyone. We can’t afford to pay care home fees.

Rob: We can’t afford to treat every disease that we’re trying to treat unless we’re willing to pay more. So it seems to me really like what you’re talking about when you said we need to be better before we try and automate processes that aren’t working. 

Rob: I remember Elon Musk, and the mistake he made was that he tried to automate everything straight off. Then he realized he had to go back and say, okay, we’ve got to make the process work, then we can scale it. And we’re trying to scale... let’s not talk about business, let’s talk about a society [00:48:00] that’s on the verge of collapse.

Rob: When more than half of marriages are failing, when the NHS, social care, the police, all of these systems are on the verge of collapse, how can we scale that? We’re just going to create more chaos.

Muhammad: A quick fix for the NHS, on a lighter note: we should make the likes of Boris Johnson pay the 300 million a week back into the NHS coffers, and maybe hold him accountable.

Muhammad: I’m sure everything will be sorted. Just a quick one on the process you mentioned, Rob. I recall I once drove around about seven McDonald’s and three KFCs in my county.

Muhammad: I’m sure all of you have used those kiosks: you walk in and order from the kiosk, and you don’t have to talk to any human, because I don’t want to speak to anyone. I just pick my seat, and one robot, supposedly a human, will walk to your table with a fake smile and throw your tray on the table, and you eat and bugger off, right?

Muhammad: I intentionally wanted to change an item in a meal, and at six McDonald’s and one KFC I failed, [00:49:00] because they didn’t acknowledge my request. They have this process and this kind of mindset: this can’t be changed. Chaos is not programmed in, even though, and I’m sure Giannis can confirm, it’s very easy to program that feature.

Muhammad: You can tweak a basket, you can edit it, but that feature isn’t available. On my last try, I was probably very close to Milton Keynes. I walked into the KFC and did this intentionally: I ordered something, then walked to the counter and asked this lady, can you change that?

Muhammad: And she with a smile and she said yeah. What would you like? And I said I can’t drink this. Can I have a Diet Coke? And he said, yeah, with pleasure. And I was like shocked that she’s saying this with pleasure, because this word I didn’t hear from the six prior McDonald’s, right?

Muhammad: And then we started chatting. And I realized that this person was actually talking to every customer who walked in to pick up. The environment in that McDonald’s or KFC became very lively, and everyone was cheering. It’s not like people just standing in the [00:50:00] queues; usually it’s an experience everyone wants to get over with quickly.

Muhammad: When I walked out, I looked at the reviews, and I was shocked that this place, as opposed to the others, had very decent reviews. And I actually left a good review as well, because I felt a human connection there. On the other side, what we all know, and I think Paula or Neil mentioned it: you pick up the phone to talk to, let’s say, the O2 network, and it’s minutes and hours of waiting.

Muhammad: There was an issue two months ago when I had a dispute with O2, so I was spending considerable time on my phone. On one occasion I was put on hold for an hour and a half, but then I thought, okay, let me do something about it. So I tried to dodge the system, this automated human, our friendly O2 bot trying to help us.

Muhammad: So I tried to give wrong answers, and I realized after the fifth time that the more wrong answers I gave, the quicker I was put through. Then I started doing the same elsewhere. The other day I had to talk to the BMW guys, and the first time, at 19 minutes, I said no, I don’t have time, and cancelled the [00:51:00] call. Redialled. What are your security digits? Wrong. What’s your date of birth? Wrong. Thank you, we’ll connect you to a person right away. Hello, how can I help?

Muhammad: So that’s the workaround I’ve found. But why am I doing this, dodging the system? I shouldn’t do it, but I need to talk to someone. I cannot have somebody telling me press one, press two, press three, and then you miss putting a star next to your NI number and it throws you back in the queue, right?

Muhammad: And I think we are not ready for this as humans. I like what Rob mentioned about processes: polish the processes, then replicate them. There are great minds like Giannis and great developers sitting out there; they can code wonders. Give them a benchmark: guys, we need this.

Muhammad: And I think what Paula mentioned earlier: garbage in, garbage out. It’s not that we necessarily have to put in garbage; we can put in some lovely rose flowers and daffodils and whatever, and in the end it becomes a bouquet. That’s all we need.

Neil: At the start of the [00:52:00] conversation, I talked about light and dark, and I think that’s always going to be the case with humans, right?

Neil: I absolutely recognize some of the things that Rob is talking about, this drive for profit at any cost. But I think that is going to max out, and it probably already is starting to in some organizations. And I think there are two things that have a real impact.

Neil: First, the wellbeing and the burnout of the people in the organization. I don’t think it’s sustainable to carry on; we talked about more with less last time. I don’t think it’s sustainable to continue the way we’re driving people, trying to reduce costs and increase efficiencies and so on.

Neil: And we see escalating burnout, escalating effects on wellbeing, people leaving organizations, all those sorts of things. How do we fix that? Through the uniquely human skills we have that can engage with people, that can empathize and understand people, and really [00:53:00] start to demonstrate concern, empathy and emotional intelligence that improves life at work, and almost certainly at home, through that human touch.

Neil: Firstly, I think by being more human at work, having more time to be more human at work with emotional intelligence and so on, you build your employee value proposition. That’s the first thing. If we look at some of the people going into work at the moment, just starting work for the first time, they want purpose, don’t they?

Neil: And they don’t want to be driven like that. We all want purpose; frankly, we’re all human beings, we all want purpose in our lives. And the less we can see purpose, the more likely we are to really consider what our future looks like. So I think there’s something about the important human skills.

Neil: That improves the wellbeing of individuals, and therefore the employee value proposition overall. So the organizations that are focused on improving EVP and wellbeing for [00:54:00] individuals are going to be more attractive to work for. That’s the first thing. Then we’ve talked about customer expectations: we absolutely want that human touch in our customer interactions.

Neil: I think organizations providing that human touch in their customer engagement are going to have a proposition that other organizations don’t have, and will therefore become more attractive. Where I think we’ve been hampered in the past is: how do you reduce costs if you’re not implementing solutions that reduce headcount?

Neil: And that’s where I think AI comes in. If we can use AI in ways that free up people’s time, we can spend more time with the people in our organizations, bringing that emotional intelligence, thinking about people’s wellbeing, making sure they see real purpose in their work, and improving their lives at work.

Neil: But we can also improve the experience for customers and clients. And when we talk [00:55:00] about service centers and things like that, I suspect a lot of this is around what large consultancies offer. Large consultancies present an answer.

Neil: And Paula talked about behavioral change and human responses. It sounds appealing: somebody’s got this answer, there are five steps to your success, it’ll take 18 months and 15 million quid. That sounds quite appealing, but it doesn’t work, and it won’t work in future, I don’t think. And so, the light versus the dark.

Neil: I believe the light will shine through, and organizations will grow because they’re caring for their people. That improves EVP; people are more likely to stay or want to work there. And I think the human touch on the customer and client side will actually drive demand through customer growth, and those organizations ultimately, I think, will grow, and that creates more purposeful work for people, where they’re engaging with other [00:56:00] people.

Neil: As I said at the beginning, I’m an optimist, right?

Rob: I can see that. When you look at the eight-hour work day, it was popularized when Henry Ford realized there were diminishing returns to working past that: more mistakes and that kind of thing.

Rob: So I can see it’s going to become more and more important. The early days of human resources were about helping people be at their best, and I can see that there’s going to be a real shift to health: helping people sleep better, all of these things that can increase their productivity; reducing work hours to maybe six, so that you’re getting the six best hours.

Rob: The sticking point is that the fundamental driver, in the current structure, is always going to be greed. And that can be justified: if you look after your workers better, you’ll get more profit and you can be more attractive.

Rob: But the sticking point is that we have fundamentally separated ownership of [00:57:00] public companies from their execution. Because ownership sits with share funds and pension funds, they don’t care; they have to have a return, which is based on our greed, what we want in our pension, what we want in our share price. So they don’t care about anything other than profit, because that’s the way their business works.

Rob: And they’re not the ones who are actually dealing with the people. They’re not the ones who actually have to make it work. They’re just looking for who will bring the best reward. 

Neil: Yeah, I don’t see those things as mutually exclusive. If you care for the people, if they’re performing at their best because they’re managing their wellbeing.

Neil: If they’re motivated, if their bosses care, if you’ve got all those human interactions taking place in a way that creates safety, interesting work, all those things, you will be profitable. I don’t see them as mutually exclusive.

Rob: Just to give an analogy, I [00:58:00] worked in education for a while. And in education, it’s fundamentally about results.

Rob: And so the head teacher’s career depends on results. When I went to school, I was told: if you don’t get anything, I don’t care, it’s up to you. The responsibility was purely on me.

Rob: Later, when I was in schools, the responsibility was on the teacher, and the kids were like, if I don’t get anything, it’s your fault.

Rob: And so the children had no responsibility for their own results. And everything in a school judges the teachers, politically, because politicians say we’ve got bad teachers, we need to change things, we need to punish teachers, teachers are lazy. And it’s all because politicians get voted in by saying, I’m going to change education, because there’s a problem with education.

Rob: And that’s because education is politically driven. It’s no longer about teaching children; schools become factories where you train them. So in year seven, they would come in and have a GCSE test. They hadn’t been taught the curriculum, and it didn’t [00:59:00] matter, because they were going to be drilled from the moment they came in in year seven.

Rob: In year eight, in year nine, every time, they were taking GCSE papers, because they were being taught to pass a test. Primary schools drill for one month, one term, two terms, just to pass SATs, because that’s how they’re judged. They don’t care about educating the child. So the quality of what we learn is all driven by political dogma.

Rob: And in the same way, yes, managers will care. But what I’m seeing is a tension: keeping my job depends on keeping the shareholders happy, but the compromise is that I also have to keep the employees happy. And so we’ve got two conflicting demands: employees want to be treated as humans.

Rob: Shareholders want money. They’re never going to come into contact with the employees, and they don’t know or care, because employees are just a statistic. So the bit in the middle is: how do we [01:00:00] resolve this conflict? Yes, I agree that managers will care, managers will learn that they have to care, but it’s a compromise solution.

Neil: I find this a fascinating conversation. One of the things that strikes me, and people have touched on it, is that in effect, before we do anything, we need to create the right conditions in the organization. And this is a conversation about the future of AI.

Neil: Yet we are talking a lot about creating the right conditions. And I think that is very telling about the importance of what we humans need to do in order to achieve value and success through AI. In effect, it suggests to me this is not really about the technology. This is about the human dimension, the conditions we’re building as human beings.

Neil: That makes the difference one way or the other, and it’s human. It’s human emotion, it’s human cultures [01:01:00] driving the things that you’re talking about, Rob. The absolute standout thing for me is that we’ve really got to get those conditions, those cultures and behaviors, right, in particular through leaders. But yeah, big challenge.

Muhammad: Rob mentioned this, and this is my opinion here: AI can’t fill that gap, because there is a conflict for sure. I have personally been through experiences where the shareholders demand your neck and your employees demand your love. Find the balance, right? So you give them the neck and the blood will flow to them, right?

Muhammad: And they will probably find love. It’s a very hard balance. But if you are a determined person, as you’re saying, Neil, if you know exactly that you will not compromise your values, you can be a little bit diplomatic. You can keep providing results, and results are based on your team’s effort, right? So from that school of thought, you know your employees are your biggest asset.

Muhammad: They are going to produce results for you, so the more you engage with your employees, the more [01:02:00] empathetic you are with them, the better the results, and your investors are happy. However, in certain scenarios, what I’ve also realized or found from experience is that the bigger the organization gets, the wider the distance becomes from the ground to the top, and the less empathetic you become, even sitting in that boardroom. You can then use AI, because there are tools, similar to ChatGPT: you put a question in, okay, what do my P&Ls look like this month as opposed to the same period last year, and it will give you an instant answer.

Muhammad: So you can make a decision based on data; data is available within seconds. You literally don’t have to call your accountant or anyone; it’s readily available on the screen. And the next thing is: okay, why is my COGS higher? Why is my labor going through the roof? So we’re not looking at the families.

Muhammad: We’re looking at the numbers. Okay, Mr. XYZ, you’ve got 50 people in your team, you don’t need 20 of them because we can do this, and you reduce the [01:03:00] headcount by 10. In five minutes of discussion and debate we’ve reduced headcount by 30 people; our P&L would look healthier.

Muhammad: Yeah, of course, then we can raise more funds. So to what Rob is saying, and what I mentioned earlier: it’s the human greed that comes back; we want more and more. So unless we as individuals, we as individual leaders, look at our own greed, minimize it, and filter that through to our team members, nothing changes.

Muhammad: A few days ago there was a LinkedIn post, a very good one, about specific soft skills we need to teach our kids. Absolutely. Kids are into these things; they’re far cleverer than we are at the moment. My nephew is probably four or five years old and knows more about the iPhone 13 than I do.

Muhammad: So they know everything. But one thing that is lacking is they don’t understand human connection. You go, again, to KFC, right? There are young kids, probably 16 to 20 years old, working at the front, but they lack empathy. It’s not their fault; their [01:04:00] managers and leaders didn’t teach them.

Muhammad: They've been brought in and shown the process. We are inadvertently making more robots out of these kids. In my view, that is a far bigger problem which needs to be addressed.

Giannis: It's a bit different from my side, this interesting battle between keeping the shareholders satisfied and your team happy, because I was there three years ago.

Giannis: If you are an emotional leader and you really care about your team, not only about their emotional side but about everything in general, it takes a toll. This cost me a kidney, metaphorically speaking; in reality, it's how I got my autoimmune syndrome. Because if you are too nice a guy, trying to keep everyone pleased about priorities, caring, empathy, everything, that balance is super tough to manage.

Giannis: At least that's how it felt back then. Now I've [01:05:00] found my balance. And I think my personal secret is to say more no's, not to other people, but mostly to myself.

Giannis: I think if AI is going to take over a lot of daily tasks, organizations will have more time for everything.

Giannis: I'm an optimistic person, but businesses are super money-driven. I don't know whether they're going to reinvest the money saved from AI-handled tasks back into employees, or keep it as profit, meaning I feel they're going to fire some people because they're not needed anymore.

Giannis: Instead of investing back in them: six-hour days, or a four-day work week, more flexibility with remote working, more leave, everything that sounds like a real perk, not benefits like ping-pong tables and free fruit and coffee, which I'd [01:06:00] politely call weird.

Giannis: On this side of AI in companies and organizations in general, I'm not so optimistic. But because I dislike generalizing, I feel there are a lot of companies out there that really care about their employees' burnout: predicting it, detecting it, preventing it, which is even more important than just predicting it. And I think those companies will be a compass for other companies. Not just something to copy, but mostly a help: this working practice works, so I can start implementing it in my own company.

Neil: I think we’re in a very interesting time, aren’t we? 

Neil: I think it was Accenture that reported last month that they estimate around 40 percent of working hours could be augmented by AI. That's guesswork, obviously, but born out of some research. So we're in a very [01:07:00] interesting time.

Neil: But if we think about 40 percent of savings in organizations, although that's actually augmented work, human-AI interaction, there are going to be some big decisions to be made, aren't there, by companies still driving those industrial-age ways of working to push up profit.

Neil: There are really quite big political and social questions raised as a consequence of that huge saving. If everyone ends up walking out the door, what does that look like?

Rob: I think the big change can't come from companies. I don't think it will, because as Muhammad said earlier, we've become slaves to technology, but we've also become slaves to the system, to the structure of the societies we've created.

Rob: It's us who choose the pension fund that brings the most return. It's us who choose to invest our money where we get the most return, and in doing so we become the boss we hate, the one driven by pure greed who doesn't [01:08:00] care about us.

Rob: So I think fundamentally we have to look at it as citizens. It changes when we make more conscious choices, when we stop buying products made with child labor and slave labor, when we stop perpetuating it. We have a societal operating system based on greed, one where we prioritize money over everything else.

Rob: We might say that in our individual choices we don't, but when we vote for the politician who says they'll lower taxes, when we put our money into the hedge fund or mutual fund that brings the most return, we're doing exactly that. I agree totally, and I am ultimately optimistic, Neil, but I think everything you say is right. I think we are reaching breaking point, and it's the burnout, the frustration with unhappy emotional health, unhappy relationships, unhappy work life, that's ultimately going to create the change.

Rob: [01:09:00] It's the Great Resignation and the quiet quitting that are the signs of it. There are still so many people who believe in corporate.

Rob: But what we see on LinkedIn is that so many of us have opted out of working day to day for someone else to become our own boss, because we want that autonomy. We are the cause, trapped in our own system, and I don't think any of us can see the way out. But it will happen; you can see it when you look at how society changes.

Rob: It’s when we decide where we put our attention and value. 

Rob: This has been a fascinating conversation, but I'm aware we're overrunning now. Thank you, everyone, for being here.

Rob: I'd like to finish by going around: what are your thoughts and feelings, what came to mind, has anything shifted?

Rob: For me, I came into this thinking about AI without really knowing that much about the technology.

Rob: Okay, I can see it's going to make some changes. But what I see is that you can't add technology to a system that doesn't [01:10:00] work, and it's very clear we have to be much more deliberate. The reason we've created so much burnout, so many emotional problems for ourselves, is that we've overvalued the economic side. We've been manipulated, starting with our food choices: food manufacturers focus on how much sugar they can use to make people addicted to certain foods. Drink, gambling, all of these industries are built on human weakness. We need to change that mentality.

Rob: Before we scale it, because otherwise we're just creating more social problems.

Neil: At one level, I think a lot of what we've talked about comes down to systemic issues that are driving the behaviors of today, and systemic issues at a micro level whose implications take a long time to resolve.

Neil: So there's something about the culture we're in at the moment, if I generalize around this sort of industrial age. There are a lot of systemic issues that are a consequence [01:11:00] of the way many organizations work, and I think we need to start addressing that.

Neil: That won't be fixed by next week. And I think the way we fix it is through recognizing what makes us human, the human traits that drive those systemic issues. Why is the environment in such a bad state? Quite often, politicians are making short-term decisions based on whether or not they're going to get voted in at the next election.

Neil: We need long-term thinking to address issues like the environment, and, I would suggest, AI as well. So there are lots of big systemic issues, and the only way we'll fix them is through humans. Within work, we see the burnout and wellbeing issues. The way we address those is through humans: human empathy, caring, and emotional intelligence that is genuinely concerned about the wellbeing of our colleagues and other human beings in the workplace.

Neil: Fixing that is a human job. If we fix it, we might improve the employee value [01:12:00] proposition and become a more attractive company. We've talked about human engagement at a customer level; we fix that through humans too. I think a lot of what we talked about requires human beings to be uniquely ourselves.

Neil: To do the things AI can't do, in order to start addressing the systemic issues, the wellbeing, the improved customer service. How do we change the system? We need more time. We need more time to be human. We let the machines do the mundane to free our time, allowing us to address the big issues, the human issues.

Muhammad: I think AI is the future. Being a very optimistic person, I do believe AI has tremendous benefits for humanity, as long as we, the gatekeepers of AI, are more humane in our approach. And I still believe that a minority creates a demand, a vision that this is what we need, and the majority just starts working [01:13:00] towards it.

Muhammad: One example is these electric cars we're being sold. I have no issue with that, but I do have a question: come 10 or 20 years later, where are we going to dump those batteries? If we're talking about eco-friendliness, nobody's talking about this; everyone is talking about the short-term benefits. And I think this, again, should come down to our own conscious choices.

Muhammad: One of the things I do believe in is creating my own legacy for the generation to come: whatever I'm facing now, at least they won't face it; they'll have clarity of mind. And the best thing is to teach them how to be human, regardless of whether it's AI or MIS or XI or whatever comes in the future; we just have to adapt to it.

Muhammad: And use it to our advantage; rather than thinking of it as competition, make it an ally.

Giannis: I'm also an optimistic person, so I think AI is the past, the present, and the future. But at the same time I feel like we are in a hamster wheel that we have created, [01:14:00] and we are trapped inside it.

Giannis: We are running to keep up because AI is moving super fast. On the other hand, there's something that will never change, even as AI changes all the time: human emotions. Treating humans as humans is, I think, the bare minimum in organizations, in leadership, in everything.

Giannis: And if you really want to be, let's say, the leader you wish you'd had, you don't have to master AI, but you do have to master human psychology and human emotions. AI is something that can be taught, like any other skill.

Giannis: What I really like to say is: don't try to be a carbon-copy leader made of advice and AI and what you read or hear, but be the best leader version of yourself. Go out there and be unique.

Rob: That's a [01:15:00] motivating call to action, great words to finish with. Thank you all for being here. It's been an enlightening, interesting, and thought-provoking conversation.

Muhammad: Cheers. Bye bye. Bye.
