Today we have a treat in store for you, as we step into the fascinating data-driven world of Phil Harvey. A self-confessed beardy data geek, he has worked in a wide range of industries, and currently his interest lies in the impact of data on what we know and how we know it.
[00:00:00] – Phil Harvey
I don’t see myself as somebody who has a particular purpose. I tend to try and work out what my vision for a particular situation is, whether that’s work or life or music or anything, and then say yes to things until stuff starts to happen.
[00:00:24] – Doug Foulkes
Welcome to The Future of Work, the podcast that looks at, unbelievably, the future of work. It’s brought to you by WNDYR for their blog Chaos and Rocketfuel. WNDYR are productivity and human behavior specialists who use technology to help us humans on our digital journey from disruption to transformation.
[00:00:44]
Check them out at WNDYR dot com. That’s WNDYR dot com.
[00:00:49]
I’m Doug Foulkes, and along with WNDYR CEO Claire Haidar, we regularly meet up with industry experts and mavericks to get their take on work in the future. And today we have a treat in store for you as we step into the fascinating data-driven world of Phil Harvey. Phil is currently Senior Cloud Solution Architect for Data and AI at Microsoft.
[00:01:12]
A self-confessed beardy data geek, he has worked in a wide range of industries, and currently his interest lies in the impact of data on what we know and how we know it. In this deep dive into data, you will discover the difference between ethical and unethical AI; robotics, AI, automation, and machine learning 101; and what work in the future will be like when the robots have arrived.
[00:01:38]
But first, Claire, let’s find out what fascinates Phil Harvey.
[00:01:43] – Claire Haidar
Phil, in doing the research for this episode with Doug, I have only got one word to describe you, and that is: fascinating. From bearding to Japanese culture and the language itself, all the way back in time to CAD drawings, and then circling back today to music and writing. I’d love to start the conversation by understanding what your fascinations are. What grabs your attention and holds it for a really long time?
[00:02:14] – Phil Harvey
It’s a great question, because the phrase “really long time” is the important bit there. You may notice I like to be interested in lots of different things, but the thing that’s held my interest for the longest time is philosophy, as an underpinning for real life as opposed to a sort of head-in-the-clouds approach. And so through understanding Japanese culture and the philosophy there, through to writing stuff down, because philosophers need to do that, to even thinking about music as a different way of expressing yourself and working out how that fits into normal life as somebody who’s not a musician, somebody who just enjoys doing it.
[00:02:56]
But underpinning it all, I read a lot of philosophy, and I find it fascinating to see how people work like that.
[00:03:03] – Claire Haidar
Do you have some favorite philosophers?
[00:03:06] – Phil Harvey
I’ve got some favorite quotes, and I’m picking up new philosophers all the time. Next on my reading list is Mary Midgley, whose perspective on things I’m really keen to get. But the current quote that I tell everybody about is from Søren Kierkegaard, in his advice about procrastination. He says a person must do what they know is right when they know it is right, or the knowing simmers down. It’s this idea that you won’t know instantly what is right, but once you know it, you know it.
[00:03:43]
But if you don’t take action at that point, your ability to believe in what you know diminishes.
[00:03:50] – Claire Haidar
Profound and highly relevant to the current world climate that we find ourselves in.
[00:03:56] – Phil Harvey
Very much so, I think, yes.
[00:03:58] – Doug Foulkes
Phil, before I get stuck into the more serious questions I’ve got, I read that you call yourself a beardy data geek. And I just want to know, is that different to a regular data geek?
[00:04:11] – Phil Harvey
Well, yes, in that I have a beard. What is it my mother always used to say? If you can’t be interesting, be memorable. So the beard helps with that.
[00:04:25] – Doug Foulkes
My cousin in Belgium used to say, if you can’t fight, wear a big hat.
[00:04:31] – Phil Harvey
Somebody was giving an example where they said sometimes you need to hit them with the fish, then give them the fish, and then hit them with the fish again. And I was like, I’m not quite sure what that relates to, but I like the analogy.
[00:04:43] – Doug Foulkes
So moving on to my first very easy question, what do you believe your purpose is? What is your contribution that you’re making to the world and what is the legacy that you want to leave behind?
[00:04:54] – Phil Harvey
So that’s really deep.
[00:04:56]
And philosophically, I have a challenge with the idea of purpose overall. That’s a philosophic position in itself. But when you talk about legacy, I think that’s easier to answer. So I don’t see myself as somebody who has a particular purpose. I tend to try and work out what my vision for a particular situation is, whether that’s work or life or music or anything, and then say yes to things until stuff starts to happen. So I don’t worry about the midterm, it’s usually the immediate and the very long term.
[00:05:29]
But in terms of legacy, I think I’m privileged to have a platform in my work and being able to write a book and these kind of things. So I really hope that, you know, when it comes time for me to leave things behind, people will remember that, you know, I used my platform, my privilege to be helpful to the world as opposed to squeeze out of it every drop of money or personal fame or whatever it was.
[00:06:00] – Claire Haidar
It segues well into the next question that I want to ask you, Phil. It came out of talking to Lisa Talia, who was one of our very early guests on the podcast, and who actually referred us to you.
[00:06:12]
I know that you guys bump into each other quite often at a work level. In my conversation with her about you, and also in my research in preparation for today’s conversation, there’s a theme that runs through a lot of your writing and your musings, and it’s the theme of calling humans away from electronics, machinery, screens. And on the flip side of that, there’s a call to become makers. That’s very much related to what you’ve just shared about being useful in the world.
[00:06:48]
Why is this a theme in your work?
[00:06:51] – Phil Harvey
It’s an interesting one because of my personal journey through this. I was a professional programmer for 15 years and CTO of a tech startup and, you know, believed the technology thing, as it were. But the more experience I got, the more I came back to what the CEO of the startup I helped found used to say: technology is the easy part, it’s people that are difficult. And as a technologist, you never really get that at first, you know, code is hard and you work on that.
[00:07:27]
But the more experience I had with technology, the more that changed. I’ve cabled floors. I’ve done CAD, as you mentioned, drawing buildings and things like that. I’ve done IT support. I’ve built software, I’ve built software companies. And now I work at one of the biggest software companies in the world. And I find the technology becomes transparent. And when you look through it, you see people, the people building it, those kinds of things.
[00:07:52]
And I think people need to lose their obsession with the addictive behaviors, whether that’s news or games or those things, and start to see the people through the technology. There is a wonderful tool there to use. I mean, you mentioned music before, and the ability to make music on computers and electronic devices has never been more possible for everybody. And the ability to share it for free through the Internet and have people listen to it has never been more available than it is now.
[00:08:26]
But if you do that from just the technology perspective, you’re not really thinking about people. And so, you know, make music for people to listen to, in the same way that you should make technology for people to use and enrich their lives, and make it a platform that gives people access to human flourishing instead of trapping them somewhere.
[00:08:52] – Claire Haidar
We’re going to circle back to that. I want to pause on the commonality between music and code. While I was preparing for today’s podcast, I actually put some of your music on in the background, and your piece of music entitled Proteau, specifically the segment in its Spring version two, is a piece that I really enjoyed among the numerous creations I listened to. And it really arrested my attention, because I kind of just had it playing in the background, and the other music was kind of just there subconsciously.
[00:09:30]
But when that specific segment came on, I actually stopped, and I was like, oh, this is a really great combination that you’ve put together. And it had a happiness to it, a lightness to it, which, you know, Spring alludes to. My husband’s of the opinion that some of the best coders in the world are musicians.
[00:09:50]
He’s worked with a number of them over the years, and interestingly, looking back on the multiple teams that he’s built, he has found this correlation. What do you believe the correlation is there, if there is any?
[00:10:07] – Phil Harvey
We can play with the words there, right, because correlation is not causation. In each of those pieces, I’m going to jump from correlation to talk about the causal link between these two things.
[00:10:17] – Claire Haidar
OK, good.
[00:10:22] – Phil Harvey
I think it’s this idea of creation. If I was to put a causal link in place there: when you’re making music, you have to use tools, from the tool of your voice through to musical instruments, through to digital and electronic devices.
[00:10:42]
And you have to learn to work through those tools. You know, it could be a violin, it could be a synthesizer, in any of those pieces. And there’s a similarity there with how you produce software technology with programming, or how you build physical devices for the Internet of Things, or any of those things.
[00:11:05]
And I’m not sure whether I’d say that there is any quality comparison there, so “the best coders are musicians” or whichever. I would have said that it’s this idea of creation, where people open themselves up to different places to do that: you may have painters, builders, artists, makers, coders, musicians. It’s this idea of craft that links them all together. They want to create something through tools. And I can see where your husband gets that parallel from, and I think it’s a great connection to make.
[00:11:39]
But what you find is that people who want to be creative will find routes for that. And if one route is not satisfying enough, they’ll try others. And music is so different from visual mediums and programming that it’s a really good avenue.
[00:11:56] – Claire Haidar
Phil, turning the conversation very much to the world of work.
[00:12:01]
It’s one that you inhabit on a daily basis working within Microsoft.
[00:12:06]
It is also one that your book covers extensively and it’s one that we’re the most passionate about. It’s what our entire existence is about within WNDYR and what we get really excited about.
[00:12:21]
I’d like our conversation to really delve into some of the practicalities of the future of work. I would like our audience, having listened to this conversation, to be able to walk away with some tangibles. There’s a common theme that runs through conversations like this, and that is that this whole concept of AI and the future of work and automation is very ethereal. It’s not something that’s tangible, and there’s a lot of scaremongering happening around it.
[00:12:55]
Can you paint a picture for us of what a typical robotic-human workplace will look like, and also what it will feel like and function like, over a few timespans? So let’s look at maybe three years, five years, and then 10 years.
[00:13:12] – Phil Harvey
Interesting to put those time scales on it.
[00:13:15]
So I fully believe that the job of technology is to encourage human flourishing, both in life and work. If you take the longest time scale first, 10 years, we have the opportunity to redress some work-life balance issues, and especially the dirty, dangerous, unfulfilling jobs of different forms, with these technologies. So if I were to be optimistic, as soon as 10 years out we could start to encourage a world where people are being more creative, more artistic, having more time on their hands to do things that they find fulfilling, whether that’s in work or outside of it.
[00:14:00]
And the technology can fade further and further into the background, and people can get on with living their lives. There’ll be people in there who want to create technology and push the boundaries, as well as those who want the social side of life and don’t want to feel pressured into the box of a nine-till-five that they don’t enjoy just to buy the right to have a social life. So that’s my optimistic view. There are pessimistic views of what’s possible in that time, and in the world at the moment everything from COVID to political situations around the world can point us in those pessimistic directions.
[00:14:36]
But I think what motivates me towards that is to think about how good it could be in the future, especially as I have an eight-year-old son and I want the world to be better for him. If we pull that back right down to the three-year mark, I mentioned before that I tend to think about a vision and then think about what I can do in the short term, so I might have more trouble with the five-year timeframe.
[00:15:01]
In that three-year phase, what we’re going to look at is more people in more roles learning how to use technology in their jobs. Just before this recording, for various reasons, I was watching videos about people who distill alcohol, both professionally and as a hobby, and the discussion there of whether they should use something called a PID controller. This is a piece of technology that has been around for a long time, but the debate is still raging about whether they should be automating things with proportional-integral-derivative control of temperature in stills.
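For readers curious what Phil is referring to, here is a minimal sketch of a proportional-integral-derivative loop in Python. The gains, the 78 °C setpoint, and the one-line thermal model of the “still” are illustrative assumptions, not taken from any real controller.

```python
# A minimal PID sketch: the controller nudges heater power toward a
# temperature setpoint. Gains and the thermal model are invented.
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured, dt):
        error = self.setpoint - measured
        self.integral += error * dt                  # accumulated past error
        derivative = (error - self.prev_error) / dt  # rate of change of error
        self.prev_error = error
        return (self.kp * error            # react to the current error
                + self.ki * self.integral  # remove steady-state offset
                + self.kd * derivative)    # damp fast swings

pid = PID(kp=1.0, ki=0.05, kd=0.2, setpoint=78.0)  # ~ethanol's boiling point
temp, dt = 20.0, 1.0
for _ in range(500):
    power = max(0.0, pid.update(temp, dt))              # a heater can't cool
    temp += dt * (0.05 * power - 0.02 * (temp - 20.0))  # crude thermal model
print(round(temp, 1))  # settles close to the 78.0 setpoint
```

The point of the debate Phil describes is exactly what the three terms encode: the proportional term reacts now, the integral term removes the lingering offset a skilled distiller would otherwise correct by feel, and the derivative term damps the swings.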
[00:15:45]
Now, this is the world that we’re in right now. It’s all about the application of these technologies to new areas, and allowing and enabling people to learn new skills and new ways of doing jobs that they care about. And so I think now, and in the next three years, we have to make sure we’re focused on that, so people don’t feel like AI is scary. It’s something that’s there for them, that they can embrace in their work to do more, as it were.
[00:16:19]
I think the battles then come between those two points, because there are people who haven’t accepted the paradigm of change, people who like to think about things as they were. We see this politically, we see this culturally, and that’s a challenge we’re going to have to work through as a society. I don’t think it’s specific to AI at all, but I think if we’re going to go for optimism, we’re going to have to think about people who are enhanced in their capability by these technologies.
[00:16:52]
So we wouldn’t expect a delivery driver to be carrying around a pad of paper at the moment and managing in that way; they all have electronic devices. Now, does that make their work better or worse? I’m not sure. I think the system they’re part of will control how they feel about their work. But we see these changes happening, and I think people need to learn to expect them and feel comfortable embracing technology to make their lives better, as opposed to the things I mentioned before, those addictive behaviors which kind of trap people in the technology.
[00:17:28] – Claire Haidar
Expand a little bit on the piece that you started with there. I think you used three D’s: the dirty, dangerous, unfulfilling jobs and tech that currently exist today. Can we get a little bit specific about that? What are some practical examples of where that exists and how it’s hurting humanity as a whole, but specifically in the workplace?
[00:17:56] – Phil Harvey
So those D’s, as you mentioned, I was talking about particular jobs that are necessary, for example, but I think they can be applied to technology too. I think a good link point between the two is planning the end of life of digital and hardware devices and things like that. We’ve got people in the world who are living in situations where they see the best way to make money is to burn the plastic coverings off metal, to get to the precious metals inside, to get into the scrap market.
[00:18:32]
That’s not good, fulfilling work. It’s dangerous, it’s dirty, and it’s not something we should be encouraging, but it happens in the world for particular reasons. And I think you have the equivalent in tech with people who are online content moderators. There is some very psychologically damaging work happening now, and people are living and working under conditions which, if you look at them objectively, you wouldn’t consider good in any way. But they have to do it because the work is there and that’s the money they’re being offered.
[00:19:13] – Claire Haidar
So let me summarize what you’ve just said, because this is a critical piece of this conversation. Am I hearing you right? We gave it a three, five, 10 year time scale, it could be different, but let’s call it the short-term, initial three-year period. You’re saying that in order to get past these dirty, dark elements of jobs as well as technology, people really need to determine for themselves where they are on that change spectrum, the acceptance of it or the non-acceptance of it, and actually verse themselves in how much technology can do for them.
[00:19:55]
And if enough of a mass of people start approaching technology in that way, we will, as humanity, be able to move into a place where technology can be viewed as a tool to improve and enhance life rather than break it down.
[00:20:15] – Phil Harvey
Very much so. And I really like the way that you put that, because the change that needs to happen is that people need to ask “what is this technology able to do for me?” and demand more, as opposed to “what is being done to me by technology?”
[00:20:29] – Claire Haidar
That right there, that’s powerful.
[00:20:32] – Doug Foulkes
Phil, I’m going to jump in here, and
[00:20:33]
I want to stay with AI. I know that you are an advocate, and we hear a lot about ethical AI. Is there such a thing as unethical AI? It might also be part of what you were just talking about. And if there is, what would that look like, specifically in a work environment?
[00:20:48] – Phil Harvey
So I think there are a couple of types of unethical AI that people need to look out for. There’s the kind of accidental unethical AI, which is limiting to people, where people are not treated fairly. This could be that they’re not represented in a fair manner in the data that’s used to train a supervised learning model. There’s a kind of classical example of this in machine learning, which is in finance. There are twenty years’ worth of loan applications stored as data.
[00:21:30]
The trouble is those loan applications were reviewed and filled in by people, and within that 20-year period you have a changing socio-ethical landscape of how people view different groups in society. So if you use that data, you can accidentally bake into the AI model that is then presented to make the decision all of the human bias that was part of that process in the past. So that’s accidental unethical AI. Then you have the kind of strange, proactive unethical AI, where people are trying to achieve a goal and decide to ignore, or actively, what’s the word, actively put out of their mind, the unethical implications of what’s happening.
[00:22:25]
And this can be for political gain, for security gain, for all of these pieces. And so they actively subvert a system on purpose. We’ve seen that discussion play out with uses of data from social media, where people are actively saying “we can do this, so we will,” as opposed to “we can, but we actually shouldn’t.” And you’ll see discussions of facial recognition technology here, where the discussion was around its use in police forces: it was used because it was effective, where actually it shouldn’t have been used, because it contravenes a privacy right.
[00:23:06]
So there are some examples of both accidental and proactive unethical AI. But I think the bigger point here is that the discussion is about the ethics of AI, and how we approach these discussions to make sure the discussion is inclusive, not just to pick apart the bad things that people did.
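A toy numerical sketch of the “accidental” case Phil describes, with entirely invented data: historical loan decisions that arbitrarily penalize one group produce a model that reproduces the penalty, even for applicants with identical creditworthiness.

```python
# Invented data: historical decisions penalise group B 40% of the time
# regardless of creditworthiness; a model trained on those labels
# learns to reproduce the bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)      # 0 = group A, 1 = group B
credit = rng.normal(0, 1, n)       # identical distribution for both groups
# Historical human decisions: creditworthy applicants approved,
# except group B is arbitrarily rejected 40% of the time.
approved = (credit > 0) & ~((group == 1) & (rng.random(n) < 0.4))

X = np.column_stack([credit, group])
model = LogisticRegression().fit(X, approved)

# Two applicants with identical creditworthiness, different group:
print(model.predict_proba([[0.5, 0], [0.5, 1]])[:, 1])
# The group-B applicant gets a visibly lower approval probability:
# the past human bias has been "baked in" to the model.
```

Nothing in the training step is malicious; the unfairness arrives entirely through the labels, which is exactly why Phil calls this case accidental.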
[00:23:28] – Claire Haidar
Doug, just before you move on to your next question, I just want to interject here and share a thought. Phil, as you were talking, I loved that example you gave of the financial modeling. A company in the US called Ellevest is specifically focused on women; that is their whole mission, because they’ve actually looked at the data and the AI that is built into stock market investing, and the biases that are built up around it because of the exact thing that you just shared, and so they’ve created this platform.
[00:24:03]
So my question to you is: because we have that accidental unethical AI element in society, because of our past and the bad things that happened in the past, where we didn’t have equality across the socio-economic spectrum, are we essentially going to have to evolve ourselves out of that by doing things like Ellevest has done, where we’re essentially creating a different type of segregation for these so-called ignored socio-economic groups?
[00:24:35]
Or is there a way to address that bias without creating new biases?
[00:24:41] – Phil Harvey
So that’s a really big question, and I love it for its size, and that’s the work that we need to do. As one person with my opinion: I think there are things we can do about using simulation over data. It might be easier to balance a simulation, even though simulation is difficult in itself, than it is to trust the data that we have.
[00:25:09]
And I think you’re right that what we need to do is learn to navigate the biases and understand what is in the data and what is happening as a result, which takes a bigger kind of systems thinking than just being really good at machine learning. And so as each of these things comes up, as each of the examples comes up, we need people like these digital sociologists. You mentioned Lisa Talia before, and as a digital sociologist it is work like hers that helps us as an industry understand what the ramifications and impact of these changes are.
[00:25:45]
And then we’re going to need to learn how to deal with the ones that are accidentally not what we intended.
[00:25:52] – Doug Foulkes
Phil, I know that your personal mission is to make data empower everyone. So could you enlighten us on some recent AI developments that you’ve come across that could maybe help do that, whether they’ve surprised you, confounded you, or even scared you?
[00:26:11] – Phil Harvey
I’m trying to think of one that scares me when you ask that question, because technology is so transparent, you know, you kind of see what’s going on. The things that scare me are things like autonomous weapon systems, and the idea that a person would go, oh well, we’ll make a machine decide who to shoot. That’s terrifying in the world.
[00:26:34]
And then the kind of surprising and confounding is this manipulation of information that is happening across the world, where, from the news through to what’s online, people are learning to manipulate and disrupt, in terrible ways, the information exchange that we have with the world around us.
[00:26:59]
You know, when I was growing up, when we first got the Internet in the West Country in the UK, the main lesson was: don’t believe anything you read online. That doesn’t seem to be the case for many people. But then I also see things that, as you said, delight me and make me really happy, and there is that full spectrum where you go, yeah, we can build models and systems and understand how to, let’s take sustainability for example, create the circular economy through technology.
[00:27:40]
To build artificially intelligent models that help get to zero waste, both through human intervention and the interaction of businesses in different ways; those kinds of possibilities really delight and excite me.
[00:27:54] – Claire Haidar
Expand on that a little bit.
[00:27:56]
What does that look like at a practical level? There are a lot of articles being written about it. But can you again give us a very practical example of what a circular economy can do inside, for example, a neighborhood or a single school?
[00:28:12] – Phil Harvey
So there are different granularities to that.
[00:28:15]
There’s something in Denmark, I think it’s known as a symbiosis of companies, where a company can’t set up in that area unless it produces useful waste for another business and consumes the waste of a business that already exists. So, for example, in chemical processes you have people who need to grow enzymes for particular things, and that takes a certain temperature of liquid, those kinds of pieces. So any business whose heated liquid comes out as waste can pass it on to the next business to help maintain that cycle.
[00:28:54]
And I think schools and communities are a good example, where you could look at it from a skills perspective. This kind of happens on a small, human scale, with parents coming in to talk about their work, or guardians sharing their experience in particular ways, you know, coding schools in libraries.
[00:29:13]
Those small-scale things can be really amplified by AI technology, because it can search a much wider space. It can look at the systems in much more detail than we can afford humans to do, to start to look for new connections between people to allow that skills transfer. It could be something as simple as apprentices who are having challenges in COVID now being able to find new roles that will help continue their education and professional development, roles they might not have thought of before.
[00:29:52]
This kind of circularity, where you take somebody who’s been rejected by one system and plug them into another system, I make that sound kind of robotic and mechanical, but that’s what we’re looking for: these places where the loops can be connected up again.
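As a trivial sketch of the matching problem underneath Phil’s symbiosis example, with invented companies and waste streams: pair each producer’s waste with a business that can consume it. A real system would search a far larger space of attributes, volumes, and locations, which is where the AI comes in.

```python
# Toy "symbiosis" matching with invented firms: connect each waste
# stream to a business that consumes it.
produces = {"brewery": "spent grain", "power plant": "waste heat",
            "farm": "manure"}
consumes = {"farm": "spent grain", "greenhouse": "waste heat",
            "biogas plant": "manure"}

# Invert the consumer map so each waste stream points at a taker.
takers = {waste: firm for firm, waste in consumes.items()}
for firm, waste in produces.items():
    match = takers.get(waste)
    if match:
        print(f"{firm} -> {waste} -> {match}")  # the loop is closed
```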
[00:30:11] – Claire Haidar
I love that, because it comes back to the first part of this conversation, where one of the things that you’re passionate about is calling people to be makers. This is a classic example of it. It’s not just me making a house for myself. It’s not just me, singularly, within those walls. It’s me within a neighborhood, and how does my presence within that neighborhood impact the neighborhood? It’s starting to think beyond the self, which is something that the world just needs so much of right now.
[00:30:45] – Phil Harvey
Yes, it sounds like we’re of a common mind in that regard. What worries me is that parts of the world, and communities in the world, don’t see that. They see being individualistic as the goal. But I don’t see that as working, in the same way that we’re in a climate emergency because there was a socio-religious belief that the world was there to be exploited: it was a gift from a creator to humanity.
[00:31:16]
And humanity could exploit it as they will. That’s not the case, where, you know, Earth system science, or Gaia theory as it were, is a more realistic approach: we’ve got to account for the ecosystem services that we use and the human impact of what we do, as well as how we grow value in business and encourage human flourishing.
[00:31:42] – Claire Haidar
One of the very small ways that I’m trying to instill this into my two kids, and I, like you, have an eight-year-old, who actually just turned nine yesterday, is something that I learned from my grandmother. She said: it doesn’t matter where you go, always leave the place that you’ve been in better, or at the absolute minimum the same as you found it.
[00:32:11]
And, you know, one small place where I’m trying to instill that into my daughter extends even to a public restroom. So many people leave those types of places in shambles because they’re just trying to get out of there. But don’t leave toilet paper lying around; pick it up, make sure it’s in the trash can. Don’t leave the toilet seat dirty, things like that.
[00:32:37]
If you splashed water all over the basin, wipe it down, make sure that the person coming after you has a good experience, and they may never know that it was you doing it. In those small, tiny ways, I think that’s where we’ve got to start, because the concept of a circular economy feels so big. But if we can break it down into these very bite-sized chunks of what it actually looks like, it’s actually very basic, simple, “what does it mean to be a good human” things.
[00:33:11] – Phil Harvey
There are two stories you remind me of there, one of which stays in public bathrooms. But I’ll start with the other one, which I was discussing with my wife last night. It was a Japanese expression that I can’t remember now, but essentially, if I translated it, it came to: leave the flowers, but take your rubbish. It’s an idea about being out in nature, where you’re leaving behind the things that are there for other people.
[00:33:38]
You’re taking your impact with you. So, I’ve recently had a book published, Data: A Guide to Humans, and it talks about something called cognitive empathy in data systems, essentially crossing over these sort of soft skills for data scientists, if you will. And there was an example that didn’t go into the book, so, you know, a little extra thing for people: when you’re in those public restroom scenarios, you’ll see a sign that says, take a moment to think about the person who’ll use this after you.
[00:34:14]
And we have that sign in our lives, exactly to the point you made, but we don’t have that sign in data systems, in technical systems. And so many people don’t think about it. They leave their, let’s call it their rubbish, their emotional baggage, their paradigm, in the system, and they don’t think about the next person who’s going to use it or going to need to use it. So we ask people to do this in completely sporadic ways, and just as you’re saying about how we instill this in our kids, we don’t have the right kind of education in place to teach the rational side of cognitive empathy.
[00:34:54]
We kind of tell people to be empathic, emotionally empathic, but we don’t train them in how to think about other people’s perspectives in the right way.
[00:35:02] – Claire Haidar
OK, bringing it back to what this whole conversation is about: robotics, AI, automation, machine learning. Again, these are big words, thrown around haphazardly at conferences and in boardrooms. Give us the 101. What is the difference between robotics, AI, automation, and machine learning?
[00:35:25] – Phil Harvey
What a great set of words to pull together. So automation, I think, is where this starts. Automation is taking a process that is not happening automatically and making it happen automatically. Interestingly, though, if the process exists, it’s probably being done by a person. I was talking about distilling earlier: there are people who are extremely skilled in brewing and distilling alcohol, who throw wood on the fire and can judge what’s coming out of it.
[00:36:00]
Using a controller, the PID controller I mentioned, allows part of that process to be automated. So automation itself is the insertion of technology where there was a human; let’s keep it as simple as that. Then robotics. Robotics has become a bit muddy, because in simple terms it’s the mechanical device that does the automation: think of a robotic arm in a factory that does a particular process. But with this idea of robotic process automation, which is largely done in software, people will hear the term in different senses, not just the hardware machine that does it.
[00:36:45]
But I like the fact that it’s kind of challenging as a word. The original word was a Czech word for, as it were, a mechanical slave, a device that did your bidding. And with robots and robotics, the study of it, if you look at the fancy videos on YouTube of Boston Dynamics and those things, the skill of that is enabling the robots to do more and more things. The idea of using a robot to do things is still as it ever was.
[00:37:22]
And so the fact that it can now independently walk across rocky ground or jump up to a higher platform is just enabling that physical device to do more. Then we need to switch to machine learning, and machine learning is the process of distilling software from data. You derive a piece of software, which is often called a model, from a set of data, as opposed to a piece of software generating the data for you. So let’s say the software is social media.
[00:37:57]
You go onto a web page, it tracks you, it tracks what you click on, and all of those things generate a data set. That is software-derived data. If you take that data and you generate a model that decides what ads to show you next, that model has been derived from the data. And then think about the financial example we were talking about earlier, where somebody takes that data set about loan applications and builds a model about who should or shouldn’t get a loan.
[00:38:28]
That’s machine learning, in a simple sense. When you take any learning process, whether that’s learning from data or learning from a simulation, which we also talked about, and produce some kind of model or agent or piece of software, if you will, that does things automatically for you, that’s artificial intelligence. Now, pulling all of those things together, you can imagine the machine that learns from data, from its sensors, from the cameras, about the faces it sees, all embodied in one robot that can now walk across rocky ground, jump up onto higher platforms, and decide what to do completely autonomously.
[00:39:20]
So it’s a bit of a winding, circuitous route with lots of different examples in there. But all of those pieces tie together into these tools which help us flourish as people, as humans; they can do the work for us. And the smarter we make them, the more work they can do for us. And then it is our responsibility to take that platform and do something even greater with it as a species.
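To make Phil’s automation-versus-machine-learning distinction concrete, here is a small sketch under invented data: automation is a decision rule a person writes down, while machine learning distils a similar rule, a model, from recorded examples of past decisions. The lending rule and the numbers are hypothetical.

```python
# Hypothetical lending example: the rule and the data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

def handwritten_rule(income, debt):
    # Automation: a human encodes the decision logic explicitly.
    return income > 2 * debt

rng = np.random.default_rng(1)
income = rng.uniform(10, 100, 500)
debt = rng.uniform(0, 60, 500)
past_decisions = income > 2 * debt   # pretend these are historical records

# Machine learning: distil the "software" (a model) from the data instead.
X = np.column_stack([income, debt])
model = LogisticRegression(max_iter=1000).fit(X, past_decisions)

applicant = [[50.0, 20.0]]               # income 50, debt 20
print(handwritten_rule(50.0, 20.0))      # True: the rule a person wrote
print(model.predict(applicant)[0])       # True: the rule the data taught
```

The two usually agree; the difference is where the logic came from, which is also why the bias example earlier matters: a distilled rule is only as fair as the decisions it was distilled from.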
[00:39:49] – Claire Haidar
Phil, thank you. I honestly think that is perhaps one of the most elegant breakdowns I’ve heard, and I’ve been in many conversations about this. And I really think, for our audience, it’s a brilliant breakdown.
[00:40:03]
Another big, bold question before I hand over to Doug: should it be our personal mission as human beings to figure out ways to essentially work ourselves completely out of our own jobs? So let’s just take me, for example: CEO of a tech company, board member on multiple boards. Should I be trying to figure out a way to get myself out of that work so that I can do more for humanity?
[00:40:30] – Phil Harvey
You as a CEO is a great example, because isn’t a CEO’s job to become entirely redundant in the business? The best CEOs have the most amount of time, maybe. And so I think a CEO is a good example: your entire job is to get yourself more time. The business is successful if you can be focused fully on what’s next, and I think that’s a good analogy for people. Some people want to live their life as they see fit, and I don’t think we should put pressure on them to do otherwise.
[00:41:08]
Some people want to change the world and make the world a better place for everybody, and both lives are valid. When we talked about purpose before: to define what is right for a human is not anybody’s job. We are creatures on this world of, you know, limited lifespan. But if you want to, and you have the vision to do that, yes, it’s your responsibility to work out how to make the world better for everyone. And I’m not sure that many jobs could be considered good for the people doing them, whether that’s their mental health, the environment they live in, or any of those things. So yes, those of us who want to should be focused on removing all work and encouraging human flourishing wherever it can be found.
[00:42:03] – Doug Foulkes
Phil, I don’t know if this has the same answer, but the question is very similar, maybe looking at it from a different angle. I’m not a CEO; I’m a regular guy in a job.
[00:42:12]
What should I be thinking about with regards to automation of my job?
[00:42:17] – Phil Harvey
I think it goes back to that question of demanding what technology can do for you. And that could be, you know, giving you the opportunity to teach a machine to be your companion in that role, to do the things that you don’t want to. If you’re in a job where everything you do is something you want to do, and there’s nothing about it that you dislike: congratulations, you win. But where there are things that you don’t want to do, ask: why can’t technology do this for me?
[00:42:50]
And the more you ask that question of CEOs and people who should be focused on what’s next for the business, the better. Let’s take a really simple example. Let’s say you work with customers and you get a huge amount of energy from running workshops with people. Why don’t you spend all of your time doing that and generate value for the business by doing that? If the CEO is not focused on enabling you to do that through technology, then something’s broken in the system.
[00:43:19]
And there’s a good thing to fix there, just as a simple example.
[00:43:25] – Doug Foulkes
Phil, we’re coming towards the end of our time together. It’s been an absolutely incredible forty-five minutes so far. I started with a sort of easy question about purpose and legacy; I’ve got another easy one to finish off. As employees and employers, what civil duty rests on us with regards to ensuring that ethics become and remain the backbone of our futures?
[00:43:49] – Phil Harvey
So this is in some ways, I think, an easy question, because we’re in an environment where we’re able to discuss that.
[00:43:58]
So businesses need to enable that conversation, and people who have the possibility of having that conversation need to have it. If you see an ethical issue, or you’re in an environment where that conversation is not possible, that needs to be solved. So the duty is to keep this movement going, where we’re talking about the ethics of technology and keeping that conversation relevant, because we can. If we stop because it’s difficult, or we stop because we think that something else is more important, the ability to do it will be taken away from us. Not because anybody will intentionally take it away, but because the system of work in itself will focus on something else.
[00:44:49]
We saw this with everything from the big data boom, where suddenly people could talk about storing data outside of relational database systems, to the use of machine learning and AI in the wider world, all the way through to the ethics of technology. Each of these things will die and become ignored if we don’t keep talking about them; they live on the interest that we have in them. And then they will consolidate down into platforms and foundations that we will start to take for granted.
[00:45:24]
And now is the time to do the work, if that makes sense.
[00:45:28] – Claire Haidar
That’s a very big, bold call. It’s a huge responsibility.
[00:45:34] – Phil Harvey
It is. But, you know, I’ll call out Lisa Talia again here. She is, for me, in my mind, in my world of work, a representative of the digital humanities.
[00:45:45]
And this is a relatively new field to be named, but we have it now. So we need to encourage people, the anthropologists, the sociologists, the psychologists, the philosophers, all these people, to take advantage of that. Yes, it’s big and bold, but this is where work comes from. This is where hugely fulfilling work can come from.
[00:46:08] – Claire Haidar
Yeah. Phil, I’m honestly really sad that we’re coming to the end of our conversation together.
[00:46:15] – Phil Harvey
Thank you.
[00:46:15] – Claire Haidar
And I’d like to end off with a very particular question, but I also want you to please share with us about your book. I’ve read snippets of it; I haven’t yet read the whole thing, and I can’t wait to dig into it.
[00:46:32]
And I would love to hear from you.
[00:46:39]
What is the single most important question we should all be asking ourselves today with regards to our future careers and our future work, that we haven’t covered in today’s conversation? And I’m specifically linking that back to your book, because from the snippets that are available out there to be read at this point in time, I think you’re alluding to a lot of those questions that we need to be asking ourselves.
[00:47:03] – Phil Harvey
Yes. I’m just realizing that this is a very aspirational conversation that we’re having, and it’s kind of twofold. I want to stay with that and, in the context of the book, say this: if you’ve ever felt that empathy is not for you, and that may sound like a strange statement to yourself or Doug, but there are people who have rejected it and gone, “I don’t want to think about other people. I just want to focus on technology.” Empathy will make you more successful in the work that you do.
[00:47:42]
And there is a form of empathy called cognitive empathy, which is really easy to practice and learn, and which will even make your technical work better. So it’s not a big ask. And yes, I’m kind of promoting my book here, but I wrote it for this purpose; a book’s a relatively big endeavor, so I did it because I think it’s important. And so that question is: how much more can I know about other people?
[00:48:11]
There are people in here somewhere. If I can’t see them, where are they? Because nobody’s doing work completely separated from people. Whatever you’re doing, there are people in there somewhere. The more time you put in to understand them, the better you’ll be at your job. And if that means making more money, great. If that means spending less time doing your job and more time being social, great. Empathy can help you do that. I fully believe that, and it’s a question that everybody can ask themselves, regardless of their role.
[00:48:48] – Claire Haidar
Love it.
[00:48:49]
Phil, I can honestly say to you, and I’ll let Doug weigh in with his thoughts, but in these last 50 minutes I’ve perhaps learned the equivalent of what I learned in three years at university. So thank you. Thank you for being here with us today, and thank you for sharing your deep, deep wisdom with us. I’m really looking forward to your book, and we’ll definitely make sure to share links to access it with our listeners, because I know that a lot of them would also be interested in it.
[00:49:18] – Phil Harvey
Thank you so much.
[00:49:19]
That’s really kind of you to say.
[00:49:20]
I’m flattered and a bit embarrassed.
[00:49:24] – Doug Foulkes
Phil, from my side also, it’s really been a big eye-opener for me. It’s been an incredible conversation. Thank you very much for your wisdom and your thoughts.
[00:49:34] – Phil Harvey
Well, thank you for giving me this opportunity, it’s very kind of you both.
[00:49:38] – Doug Foulkes
Deep wisdom indeed. And if you haven’t done so yet, I strongly suggest you read Phil’s new book; links are in the description below.
[00:49:46]
And that’s it for today. We hope you’ve enjoyed this podcast. If you have, we look forward to inviting you back sometime soon.
[00:49:53]
Just a reminder: for more information about WNDYR and the integration services they supply, you can visit their website at WNDYR dot com. And as always, from me, Doug Foulkes, and Chaos and Rocketfuel: stay safe, and we’ll see you soon.