How automation and AI are currently revolutionizing recruiting.
Artificial intelligence is effectively rewriting the rules of the recruiting process. What’s the point of AI? To make us mere mortals more efficient in the workplace? We’re being told that AI will surely be able to outperform us on most tasks, like sourcing, stack-ranking candidates, resume matching, and follow-up communications. So, in the future, who will be the real recruiting subject matter expert: the AI or the human? Artificial intelligence has hit the headlines with some really bad press about injecting unconscious bias into the recruiting process, over-automating tasks, and removing too much human interaction. In this episode we discuss the myths, the reality, and everything in between. So stick around; we’re going to get started right now.
Guest: Arran Stewart
Job.com
Transcript
0 00:00:00:10 00:00:03:17 – Artificial intelligence is|effectively rewriting the rules
1 00:00:03:17 00:00:05:07 of the recruiting process.
2 00:00:05:07 00:00:06:24 So, what’s the point of AI?
3 00:00:06:24 00:00:10:16 To make us mere mortals more|efficient in the workplace?
4 00:00:10:16 00:00:12:16 We’re being told that|artificial intelligence
5 00:00:12:16 00:00:15:02 will surely be able to|outperform recruiters
6 00:00:15:02 00:00:18:12 in most tasks, like sourcing,|stack-ranking candidates,
7 00:00:18:12 00:00:22:16 resume matching and, of course,|follow-up communications.
8 00:00:22:16 00:00:23:12 So, in the future,
9 00:00:23:12 00:00:26:14 who’s going to be the real|recruiting subject matter expert?
10 00:00:26:14 00:00:28:04 The AI or the human?
11 00:00:29:19 00:00:31:28 Artificial intelligence|has hit the headlines
12 00:00:31:28 00:00:33:23 with some really bad press
13 00:00:33:23 00:00:36:11 about injecting unconscious bias
14 00:00:36:11 00:00:38:05 in the recruiting process,
15 00:00:38:05 00:00:40:16 about over automating tasks
16 00:00:40:16 00:00:44:04 and removing too much human interaction.
17 00:00:44:04 00:00:47:01 In this episode, we’re|going to discuss the myths,
18 00:00:47:01 00:00:49:22 the reality and everything in between
19 00:00:49:22 00:00:51:14 with artificial intelligence.
20 00:00:51:14 00:00:52:15 So, stick around.
21 00:00:52:15 00:00:54:00 We’re going to get started right
22 00:00:54:00 00:00:56:22 (rock music)|now.
23 00:01:05:07 00:01:07:03 My next guest is a global keynote speaker
24 00:01:07:03 00:01:10:05 and co-founder of the|blockchain recruitment platform,
25 00:01:10:05 00:01:11:09 Job.com.
26 00:01:11:09 00:01:14:08 He’s been in the recruiting|industry for 15 years now
27 00:01:14:08 00:01:15:23 and he’s consistently sought
28 00:01:15:23 00:01:19:21 to bring recruitment to the|cutting edge of technology.
29 00:01:19:21 00:01:23:22 His former projects included|reinventing the way job content
30 00:01:23:22 00:01:27:09 found candidates through|utilizing matching technology
31 00:01:27:09 00:01:29:06 against job aggregation.
32 00:01:29:06 00:01:31:06 His current venture, Job.com,
33 00:01:31:06 00:01:34:11 is an AI- and blockchain-enabled platform
34 00:01:34:11 00:01:37:03 that seeks to reduce hiring costs
35 00:01:37:03 00:01:38:22 by as much as 70%
36 00:01:38:22 00:01:42:06 while paying candidates|up to 5% hiring bonuses.
37 00:01:42:06 00:01:44:26 He’s also the father of five kids.
38 00:01:44:26 00:01:46:29 Please welcome Arran Stewart.
39 00:01:46:29 00:01:48:07 – Thank you for having me on, Jeff.
40 00:01:48:07 00:01:49:23 It’s a pleasure to be here.
41 00:01:49:23 00:01:50:18 – Fantastic.
42 00:01:50:18 00:01:52:13 So, we’re going to jump right in.
43 00:01:52:13 00:01:55:23 I’ve already given your|background and everything.
44 00:01:55:23 00:01:57:14 So, everybody’s up to speed
45 00:01:57:14 00:01:59:12 as what your background is.
46 00:01:59:12 00:02:01:04 – Used car salesman, is that right?
47 00:02:01:04 00:02:02:18 – Yes.|(Arran laughs)
48 00:02:02:18 00:02:04:03 From England, the worst kind.
49 00:02:04:03 00:02:06:07 – You can never trust a used|car salesman from England.
50 00:02:06:07 00:02:07:02 – No, especially not from England.
51 00:02:07:02 00:02:07:27 (Arran laughs)
52 00:02:07:27 00:02:09:16 – If you’re British or|whatever, I’m joking.
53 00:02:09:16 00:02:13:05 – So, artificial intelligence|has really hit the headlines
54 00:02:13:05 00:02:14:20 in the last year or so.
55 00:02:14:20 00:02:17:04 And it’s gotten a bad name,
56 00:02:17:04 00:02:20:19 some bad press about|injecting unconscious bias
57 00:02:20:19 00:02:23:01 into the recruiting process,
58 00:02:23:01 00:02:25:28 about over-automating|the recruiting process,
59 00:02:25:28 00:02:27:10 in certain tasks,
60 00:02:27:10 00:02:30:00 and removing too much of that human touch
61 00:02:30:00 00:02:32:12 or the human interaction.
62 00:02:32:12 00:02:34:26 Do you think we learned any lessons
63 00:02:34:26 00:02:37:14 from those mistakes that we made?
64 00:02:37:14 00:02:41:24 – I think some individuals|have learned some lessons
65 00:02:43:04 00:02:46:23 and I’ll throw out the example of Amazon.
66 00:02:46:23 00:02:48:29 They spent years building their AI
67 00:02:48:29 00:02:50:10 recruitment matching tool,
68 00:02:50:10 00:02:54:00 found that there was unconscious|bias, masculine bias.
69 00:02:54:00 00:02:55:04 And they just scrapped it.
70 00:02:55:04 00:02:56:14 And I don’t know that
71 00:02:56:14 00:02:58:26 that’s necessarily a|lesson learned by them.
72 00:02:58:26 00:03:00:06 Maybe they’ve got other stuff planned
73 00:03:00:06 00:03:02:04 but just scrapping it and ignoring it
74 00:03:02:04 00:03:04:02 like it’s not a problem for|the rest of the industry
75 00:03:04:02 00:03:05:20 I think isn’t cool.
76 00:03:07:09 00:03:08:25 Breaking down what you’ve just said there
77 00:03:08:25 00:03:10:14 into various bits,
78 00:03:10:14 00:03:13:07 the whole AI and unconscious bias stuff
79 00:03:13:07 00:03:14:07 is very clear.
80 00:03:14:07 00:03:15:13 It’s like anything.
81 00:03:15:13 00:03:18:20 If you teach a child with a|certain amount of information
82 00:03:18:20 00:03:19:27 and only that information,
83 00:03:19:27 00:03:21:19 they’re going to learn that specifically.
84 00:03:21:19 00:03:23:18 And artificial intelligence|is exactly the same.
85 00:03:23:18 00:03:27:03 So, you talk about a tech industry
86 00:03:27:03 00:03:29:17 that’s 80% male dominated
87 00:03:29:17 00:03:31:06 and AI’s learning from the resumes
88 00:03:31:06 00:03:32:22 that have been successfully placed.
89 00:03:32:22 00:03:35:16 It’s going to look|towards masculine language
90 00:03:35:16 00:03:39:12 and male-dominated|resumes to place people.
91 00:03:39:12 00:03:40:26 And it’s kind of interesting
92 00:03:40:26 00:03:44:17 that we hadn’t realized|this before but we have now
93 00:03:44:17 00:03:46:07 and I think from our side,
94 00:03:46:07 00:03:47:22 it’s not a plug for Job.com
95 00:03:47:22 00:03:50:07 but we have specifically hired people now
96 00:03:50:07 00:03:53:06 to retrain our AI to make it less biased.
97 00:03:53:06 00:03:54:08 But I think also, we have to remember
98 00:03:54:08 00:03:55:20 that artificial intelligence,
99 00:03:55:20 00:03:58:00 it isn’t here to replace us.
100 00:03:58:00 00:04:00:08 It’s here to make us more efficient.
101 00:04:00:08 00:04:02:09 And I think that’s the dehumanizing part
102 00:04:02:09 00:04:04:20 that we’ve kind of maybe pushed too much.
103 00:04:04:20 00:04:06:06 Everyone’s like, oh, that’s it.
104 00:04:06:06 00:04:07:16 Everything will be automated.
105 00:04:07:16 00:04:08:11 Everything’s going to be automated.
106 00:04:08:11 00:04:09:23 And we’ve been guilty of that too
107 00:04:09:23 00:04:10:28 from our side of perceiving it,
108 00:04:10:28 00:04:11:24 saying it’s all automated.
109 00:04:11:24 00:04:14:12 No, it just makes humans more efficient.
110 00:04:14:12 00:04:17:00 You need to create technology|with a human touch.
111 00:04:17:00 00:04:18:21 – Yeah, there are certain|tasks that it’s good at
112 00:04:18:21 00:04:21:17 and we’re going to be touching|into some of those tasks
113 00:04:21:17 00:04:25:05 and what tasks AI, or|even just automation,
114 00:04:25:05 00:04:27:13 could be replacing.
115 00:04:27:13 00:04:28:29 But let me get into this.
116 00:04:28:29 00:04:31:09 So, what do you think is, in your mind,
117 00:04:31:09 00:04:34:27 the most likely scandal to hit with AI
118 00:04:34:27 00:04:37:24 in the next six months to a year?
119 00:04:37:24 00:04:39:15 – Most likely scandal?
120 00:04:39:15 00:04:42:04 Well, I mean, I don’t know.
121 00:04:42:04 00:04:43:08 (laughs)
122 00:04:43:08 00:04:45:15 I’d probably have a few bits to that.
123 00:04:45:15 00:04:47:17 I actually think that,
124 00:04:47:17 00:04:48:28 and I know we’ve just talked about it,
125 00:04:48:28 00:04:51:13 but I think the bias side of things
126 00:04:51:13 00:04:53:25 is going to rear its head the most.
127 00:04:53:25 00:04:57:05 I think we’ve been|operating with technologies,
128 00:04:57:05 00:04:58:09 and we’re all guilty of that.
129 00:04:58:09 00:04:59:04 Everyone.
130 00:04:59:04 00:05:00:13 I’m not going to name|any names in the industry
131 00:05:00:13 00:05:03:00 but every major platform in the industry
132 00:05:03:00 00:05:05:24 has been using technology to deliver
133 00:05:05:24 00:05:07:19 job seekers into companies
134 00:05:07:19 00:05:09:05 and companies to job seekers
135 00:05:09:05 00:05:11:18 without any real policing or control
136 00:05:11:18 00:05:15:19 to make sure the ML hasn’t|run off on a tangent,
137 00:05:15:19 00:05:17:23 the machine learning hasn’t|run off on a tangent.
138 00:05:17:23 00:05:18:21 So, I want,
139 00:05:19:24 00:05:20:19 go on.
140 00:05:20:19 00:05:21:14 – Machine learning,
141 00:05:21:14 00:05:23:04 so, define what that does
142 00:05:23:04 00:05:26:24 and how it affects the AI|and the learning process.
143 00:05:26:24 00:05:28:10 A lot of people don’t understand.
144 00:05:28:10 00:05:29:28 They hear ML, AI
145 00:05:29:28 00:05:32:27 but they don’t know how|the two are intertwined.
146 00:05:32:27 00:05:35:03 – So, there are many different ways
147 00:05:35:03 00:05:37:04 to apply machine learning
148 00:05:37:04 00:05:40:11 and it depends on what points of data
149 00:05:40:11 00:05:42:18 you, as a tech consultancy or company,
150 00:05:42:18 00:05:46:08 are looking to improve the experience
151 00:05:46:08 00:05:48:05 and learning of your|artificial intelligence.
152 00:05:48:05 00:05:50:25 So, I’ll give you an example that we did
153 00:05:50:25 00:05:52:02 and that we do.
154 00:05:53:13 00:05:56:23 When we match a job to a job seeker,
155 00:05:56:23 00:05:58:19 they will be given jobs by email,
156 00:05:58:19 00:06:00:01 which we’re all very familiar with.
157 00:06:00:01 00:06:03:20 And the jobs they select or|they click on and look at,
158 00:06:03:20 00:06:07:25 we take into account every time|as a machine learning process
159 00:06:07:25 00:06:10:08 to increase the knowledge that
160 00:06:10:08 00:06:13:02 that was the relevant job for that user.
161 00:06:13:02 00:06:14:10 And we use it in a macro sense
162 00:06:14:10 00:06:16:26 in the sense of that was the best match
163 00:06:16:26 00:06:18:08 and also we use it in a micro sense
164 00:06:18:08 00:06:23:02 to deliver a more customized,|tailored solution.
165 00:06:23:02 00:06:24:20 So, it’s like, okay, this|person’s really interested
166 00:06:24:20 00:06:26:04 in this particular certain content,
167 00:06:26:04 00:06:27:27 so we’re going to keep|delivering back to them.
168 00:06:27:27 00:06:30:16 And we’ve noticed a trend|that lots of people pick,
169 00:06:30:16 00:06:31:23 who are similar to this person,
170 00:06:31:23 00:06:32:25 pick those sorts of similar jobs,
171 00:06:32:25 00:06:35:22 which must mean they’re|the best matching jobs.
172 00:06:35:22 00:06:37:29 But what happens with that, with trends,
173 00:06:37:29 00:06:39:01 like everybody knows,
174 00:06:39:01 00:06:41:14 it can lead into bias
175 00:06:41:14 00:06:45:25 and actually teaching the|system to go down there,
176 00:06:45:25 00:06:46:21 down the merry path,
177 00:06:46:21 00:06:49:09 which wasn’t actually correct|in the first place at all.
178 00:06:49:09 00:06:50:22 So, that’s an example.
179 00:06:50:22 00:06:51:27 – So, it acts,
180 00:06:53:02 00:06:55:29 one application could be a feedback loop
181 00:06:55:29 00:06:59:02 based on candidate or user behavior,
182 00:06:59:02 00:07:01:21 whatever behavior they take action on
183 00:07:01:21 00:07:04:03 then that is more likely to influence
184 00:07:04:03 00:07:06:13 an AI decision in the future.
185 00:07:06:13 00:07:07:13 – That’s it yep.
186 00:07:07:13 00:07:08:08 Brilliant.
187 00:07:08:08 00:07:09:09 Better summary than me. (laughs)
188 00:07:09:09 00:07:10:06 But that’s it, yeah.
189 00:07:10:06 00:07:11:08 That’s exactly it.
190 00:07:11:08 00:07:13:01 And normally, I mean look,
191 00:07:13:01 00:07:16:14 there are other ways to|use machine learning.
192 00:07:16:14 00:07:19:24 My preference is always to|try and use the behavior
193 00:07:19:24 00:07:23:18 of users as the best|indicator for machine learning
194 00:07:23:18 00:07:25:20 because what’s the best thing for each job
195 00:07:25:20 00:07:27:04 is to get the people that use it
196 00:07:27:04 00:07:28:15 to tell it what they want.
197 00:07:28:15 00:07:33:00 However, if you have a|certain subset of users
198 00:07:33:00 00:07:35:03 or in a particular, like you say,
199 00:07:35:03 00:07:38:02 sex or location, background, culture,
200 00:07:38:02 00:07:41:12 that can, in essence,|start to deliver bias.
201 00:07:41:12 00:07:43:15 It can start to create a system
202 00:07:43:15 00:07:46:14 that isolates other people that|maybe aren’t quite the same
203 00:07:46:14 00:07:50:29 and it pigeonholes you|into a niche in the end.
204 00:07:50:29 00:07:53:19 And on that scale, that happens.
205 00:07:53:19 00:07:58:19 So, it’s like when you try|and create a generic service,
206 00:07:58:19 00:08:02:08 it can be very difficult|to compartmentalize
207 00:08:02:08 00:08:04:28 and teach a machine to|kind of be all things
208 00:08:04:28 00:08:06:26 to all men and women.
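To make the click-feedback loop described above concrete, here is a minimal, hypothetical Python sketch (not Job.com’s actual system; the feature names, scoring function and learning rate are illustrative assumptions). Each click nudges the model’s weights toward the features of the clicked job, which is also how a skewed click population can quietly teach the matcher a bias.

```python
from collections import defaultdict

def score(job_features, weights):
    # Dot-product-style relevance: sum of learned weights for the job's features.
    return sum(weights[f] for f in job_features)

def recommend(jobs, weights, top_n=3):
    # Rank all jobs by current score and return the top N.
    return sorted(jobs, key=lambda j: score(j["features"], weights), reverse=True)[:top_n]

def record_click(job, weights, lr=0.1):
    # Feedback step: a click boosts every feature present on the clicked job.
    for f in job["features"]:
        weights[f] += lr

if __name__ == "__main__":
    weights = defaultdict(float)  # learned feature weights, all start at zero
    jobs = [
        {"id": 1, "features": ["python", "remote"]},
        {"id": 2, "features": ["python", "onsite"]},
        {"id": 3, "features": ["sales", "remote"]},
    ]
    # Simulate a user population that overwhelmingly clicks "remote" jobs.
    for _ in range(20):
        for job in recommend(jobs, weights):
            if "remote" in job["features"]:
                record_click(job, weights)
    # "remote" jobs now dominate the ranking whether or not that was the right
    # signal: the same mechanism behind the masculine-language bias discussed here.
    print([job["id"] for job in recommend(jobs, weights)])
```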
209 00:08:06:26 00:08:10:17 – So, what do you think one|of the biggest obstacles is
210 00:08:10:17 00:08:14:16 in training the AI in|the recruiting process?
211 00:08:14:16 00:08:17:25 Is it picking a very specific use case?
212 00:08:17:25 00:08:20:11 Is it defining the actual problem
213 00:08:20:11 00:08:22:03 that you’re trying to solve?
214 00:08:22:03 00:08:24:18 Or is it just the sheer lack of
215 00:08:24:18 00:08:28:03 or availability of specific types of data
216 00:08:28:03 00:08:32:05 for the machine learning or|the AI to make a decision on?
217 00:08:32:05 00:08:33:27 – I think it’s the lack of data.
218 00:08:33:27 00:08:34:22 – Yeah.
219 00:08:34:22 00:08:36:19 – I think data,
220 00:08:36:19 00:08:37:14 it’s like anything.
221 00:08:37:14 00:08:39:28 The more we know, the more we know.
222 00:08:39:28 00:08:41:22 And that saying is so true.
223 00:08:41:22 00:08:43:24 The more the system knows,|the more data it has,
224 00:08:43:24 00:08:47:20 the better decisions, in|theory, it should make.
225 00:08:47:20 00:08:49:02 Sometimes it’s very difficult
226 00:08:49:02 00:08:52:23 to have all of the possible|information ever needed
227 00:08:52:23 00:08:55:26 to create this ultimate system.
228 00:08:55:26 00:08:57:06 And there’s also again,
229 00:08:57:06 00:09:00:07 this missing humanized|element of common sense,
230 00:09:00:07 00:09:03:27 which is almost impossible|to teach technology today.
231 00:09:03:27 00:09:05:20 I’m sure one day it will all happen.
232 00:09:05:20 00:09:06:18 Compassion,
233 00:09:09:20 00:09:11:03 commercial savvy.
234 00:09:11:03 00:09:13:18 Like I see a used car salesman
235 00:09:14:23 00:09:17:18 and I’m like, okay, he’s|worked selling cars forever.
236 00:09:17:18 00:09:20:00 As an AI piece, it won’t realize
237 00:09:20:00 00:09:23:10 that that requires good customer skills,
238 00:09:23:10 00:09:24:26 good interpersonal skills
239 00:09:24:26 00:09:27:09 that could be transferable|to the recruitment industry
240 00:09:27:09 00:09:30:16 and they’d make a fantastic recruiter.
241 00:09:30:16 00:09:34:03 That’s sometimes pretty|difficult for AI to do.
242 00:09:34:03 00:09:35:25 It can look at the core sets of skills
243 00:09:35:25 00:09:38:09 but it can’t use a level of common sense
244 00:09:38:09 00:09:41:06 that a human being would go,|oh, look I’m selling cars
245 00:09:41:06 00:09:42:22 or telling people
246 00:09:42:22 00:09:44:10 that they’ve got a perfect|opportunity in front of them.
247 00:09:44:10 00:09:45:09 If you’ve got personal,
248 00:09:45:09 00:09:46:24 if you’re a good communicator
249 00:09:46:24 00:09:48:04 and you’ve got good interpersonal skills
250 00:09:48:04 00:09:49:21 then you’d be great for this job.
251 00:09:49:21 00:09:51:06 And that’s another thing.
252 00:09:51:06 00:09:54:14 – The information wasn’t|contextualized in such a way
253 00:09:54:14 00:09:56:22 that the AI could understand it.
254 00:09:56:22 00:09:58:15 It just kind of got put up in a bucket
255 00:09:58:15 00:10:00:06 and said, you know what,|we’ll talk about that later.
256 00:10:00:06 00:10:01:01 – Exactly.
257 00:10:01:01 00:10:01:26 – Don’t really understand it.
258 00:10:01:26 00:10:02:22 – It will,
259 00:10:02:22 00:10:04:17 it may deem some of it relevant
260 00:10:04:17 00:10:06:03 but it won’t deem it first.
261 00:10:06:03 00:10:07:11 Whereas, maybe,
262 00:10:07:11 00:10:08:06 and this has been,
263 00:10:08:06 00:10:12:12 this is me shooting down|my own service myself.
264 00:10:12:12 00:10:15:00 This is the bit where a|recruitment consultant
265 00:10:15:00 00:10:18:08 totally has the competitive|advantage over tech
266 00:10:18:08 00:10:20:02 because they’ll be able to|look at that with common sense
267 00:10:20:02 00:10:21:19 and go, no, I think you|can apply this person
268 00:10:21:19 00:10:22:17 and I spoke to them on the phone
269 00:10:22:17 00:10:23:14 and they sound great
270 00:10:23:14 00:10:25:13 and I can move them into this industry.
271 00:10:25:13 00:10:28:04 – So, if we’re lacking data
272 00:10:28:04 00:10:30:08 for AI to make relevant decisions,
273 00:10:30:08 00:10:32:04 in my opinion, AI right now
274 00:10:32:04 00:10:35:16 based on the historic artifact of a resume
275 00:10:35:16 00:10:38:07 and a horribly written,|text-based job description,
276 00:10:38:07 00:10:41:01 which may or may not have the job title
277 00:10:41:01 00:10:42:20 or job description|- Or a video.
278 00:10:42:20 00:10:44:07 – that we’re really looking for.
279 00:10:44:07 00:10:47:14 If you can make a 70 to 80% match
280 00:10:47:14 00:10:51:21 on a stack rank or even in|just a search and match,
281 00:10:51:21 00:10:56:01 I think that’s a really|good threshold, if you will.
282 00:10:56:01 00:10:57:28 What other kind of data points
283 00:10:57:28 00:11:01:13 or what other data|sources could we pull in
284 00:11:02:16 00:11:05:12 to get a higher match rate?
285 00:11:05:12 00:11:08:18 – So, there’s lots of|talk about obviously,
286 00:11:08:18 00:11:10:05 and they’ve been around for awhile,
287 00:11:10:05 00:11:12:22 and I guess obviously making|the information useful
288 00:11:12:22 00:11:16:06 but looking at people’s social profiles
289 00:11:16:06 00:11:17:22 and the way that they behave,
290 00:11:17:22 00:11:18:21 they kind of,
291 00:11:18:21 00:11:22:09 their behavioral and|their cultural behavior
292 00:11:22:09 00:11:26:16 that now is becoming more|and more relevant than ever
293 00:11:26:16 00:11:31:14 in a workplace and also|identifying cultural fit for teams.
294 00:11:32:17 00:11:34:26 Like you’ve got a team, a set of people
295 00:11:34:26 00:11:36:12 and they’ve all got different skills
296 00:11:36:12 00:11:39:09 but they’ve all got different|personality sets as well,
297 00:11:39:09 00:11:43:12 which kind of makes them|cohesive or noncohesive.
298 00:11:43:12 00:11:44:28 And I think that’s a bit
299 00:11:44:28 00:11:49:15 where AI will become even|more important and efficient.
300 00:11:51:11 00:11:52:09 I know people who are doing it already.
301 00:11:52:09 00:11:53:28 And we’re even looking|at stuff like this now.
302 00:11:53:28 00:11:56:01 I’d be lying if I was|saying we’re ahead with it.
303 00:11:56:01 00:11:57:13 We’re not
304 00:11:57:13 00:11:59:22 but we’re looking at, how do|you take AI to another level
305 00:11:59:22 00:12:01:17 where it starts analyzing|beyond the skills,
306 00:12:01:17 00:12:02:19 beyond the resume?
307 00:12:02:19 00:12:04:20 How do we understand|who this human being is?
308 00:12:04:20 00:12:08:10 How can we look at their digital footprint
309 00:12:08:10 00:12:09:17 of how they behave online
310 00:12:09:17 00:12:11:02 that might identify whether or not
311 00:12:11:02 00:12:12:08 they’d be a good fit for this company,
312 00:12:12:08 00:12:13:14 they’d be a good fit for the team
313 00:12:13:14 00:12:15:10 that we want to cohere them into.
314 00:12:15:10 00:12:20:08 – Now, to clarify, are we|talking about their personal,
315 00:12:20:08 00:12:21:10 open, public,
316 00:12:22:16 00:12:24:23 – Anything that can be searched and found
317 00:12:24:23 00:12:25:25 is subject to scrutiny.
318 00:12:25:25 00:12:27:21 – Anything you want to|be kept private though
319 00:12:27:21 00:12:29:13 can be kept private.
320 00:12:29:13 00:12:32:03 – So, that’s the|responsibility of the job seeker.
321 00:12:32:03 00:12:34:11 So, you must understand,
322 00:12:34:11 00:12:35:24 I haven’t got the statistics to hand
323 00:12:35:24 00:12:37:29 but the majority of Fortune 500
324 00:12:37:29 00:12:39:09 will do a social search for you
325 00:12:39:09 00:12:42:14 to make sure that you’re not a tyrant.
326 00:12:44:15 00:12:47:03 And so, any job seekers|listening out there,
327 00:12:47:03 00:12:48:17 if there’s stuff that’s personal
328 00:12:48:17 00:12:50:03 that you don’t want the world to see,
329 00:12:50:03 00:12:51:27 make sure you have a private profile.
330 00:12:51:27 00:12:53:20 Otherwise, if you leave it public,
331 00:12:53:20 00:12:54:28 it’s fair game.
332 00:12:55:29 00:12:59:27 But yeah, I think there will be a deeper,
333 00:12:59:27 00:13:01:14 intimate look into,
334 00:13:02:17 00:13:05:11 because you can’t|distinguish anymore between
335 00:13:05:11 00:13:07:13 like my day to day human life
336 00:13:07:13 00:13:09:19 and then my kind of|alter ego that’s online
337 00:13:09:19 00:13:11:16 because it’s,
338 00:13:11:16 00:13:12:28 that’s who I am.
339 00:13:12:28 00:13:15:24 We live our lives through the internet
340 00:13:15:24 00:13:17:00 a lot of the time.|- Some of us do.
341 00:13:17:00 00:13:18:08 – Well, some of us,
342 00:13:18:08 00:13:20:01 – I just met somebody actually,
343 00:13:20:01 00:13:22:22 one of my buddies I went|to high school with.
344 00:13:22:22 00:13:25:26 No social or digital footprint whatsoever.
345 00:13:25:26 00:13:29:12 No Facebook, no LinkedIn,|no Twitter account.
346 00:13:29:12 00:13:31:20 I was like, wow. (laughs)
347 00:13:31:20 00:13:34:06 – I mean, I would be lost with that
348 00:13:34:06 00:13:36:00 but I maybe of this,
349 00:13:36:00 00:13:36:25 I’m guilty of this.
350 00:13:36:25 00:13:39:19 I’m a millennial and I’m also,
351 00:13:39:19 00:13:43:09 everything I do is geared|towards what will Gen Zed do
352 00:13:43:09 00:13:45:23 because that’s my mark of interest.
353 00:13:45:23 00:13:47:22 I’m interested in what the,
354 00:13:47:22 00:13:48:17 millennials are important
355 00:13:48:17 00:13:51:13 but millennials in 2021|will be 40 years old,
356 00:13:51:13 00:13:52:14 the oldest ones.
357 00:13:52:14 00:13:53:12 So, they’re actually reaching a point
358 00:13:53:12 00:13:56:15 where they’re not really|that young anymore.
359 00:13:56:15 00:13:57:19 I want to know what Gen Zed’s going to do
360 00:13:57:19 00:13:58:22 over the next decade
361 00:13:58:22 00:14:00:06 and pretty much all of them
362 00:14:00:06 00:14:02:27 have an alter ego digital footprint.
363 00:14:02:27 00:14:07:23 And so, it’s like, okay, using|an artificial intelligence
364 00:14:07:23 00:14:09:00 with a level of
365 00:14:11:04 00:14:11:29 depth.
366 00:14:11:29 00:14:13:08 Can we understand the personality,
367 00:14:13:08 00:14:14:17 the culture of this individual?
368 00:14:14:17 00:14:16:12 What they might be like.
369 00:14:16:12 00:14:17:22 Are there certain things that they post?
370 00:14:17:22 00:14:19:19 Are there certain words that they say,
371 00:14:19:19 00:14:21:07 behavior that they express
372 00:14:21:07 00:14:23:12 that may make them a better fit,
373 00:14:23:12 00:14:27:02 a worse fit, a clash|between existing hires?
374 00:14:29:08 00:14:31:03 – If you think about it,
375 00:14:31:03 00:14:33:25 my digital footprint didn’t start until
376 00:14:33:25 00:14:35:25 really the ’90s or so, right?
377 00:14:35:25 00:14:38:03 So, I’ve been online since,
378 00:14:39:17 00:14:40:12 I hate to say it,
379 00:14:40:12 00:14:43:02 but 1982, somewhere around that.
380 00:14:44:01 00:14:45:22 – I was born in ’85.
381 00:14:45:22 00:14:47:25 – Okay so,|(Arran laughs)
382 00:14:47:25 00:14:49:13 I was on bulletin board systems
383 00:14:49:13 00:14:51:08 with a 9.6 baud modem.
384 00:14:52:24 00:14:55:08 You know the (imitates|old online dial-up),
385 00:14:55:08 00:14:56:03 that whole thing.
386 00:14:56:03 00:14:59:03 So, I grew up during U.S.|Robotics and all that stuff.
387 00:14:59:03 00:15:02:10 I’m quite a few years older than you.
388 00:15:02:10 00:15:05:21 But my digital footprint didn’t start
389 00:15:05:21 00:15:08:03 until I was well into my 20s.
390 00:15:10:10 00:15:15:10 Now, you have people of your|generation and Gen Zed, or Z,
391 00:15:15:13 00:15:16:08 – Yeah, Gen Z.
392 00:15:16:08 00:15:18:01 – they really,
393 00:15:18:01 00:15:20:10 their entire lives have|been documented online
394 00:15:20:10 00:15:21:29 in some way, shape or form.
395 00:15:21:29 00:15:23:25 Their parents probably|took pictures of them,
396 00:15:23:25 00:15:26:07 posted it on Facebook.
397 00:15:26:07 00:15:27:27 Even your unborn daughter,
398 00:15:27:27 00:15:30:04 she’s getting stuff.
399 00:15:30:04 00:15:30:29 – Yeah, she is.
400 00:15:30:29 00:15:31:24 I actually think,
401 00:15:31:24 00:15:32:19 – In the womb, right?|- I actually sometimes
402 00:15:32:19 00:15:33:21 think for this generation
403 00:15:33:21 00:15:35:19 and my children’s generation,
404 00:15:35:19 00:15:39:07 that it’s almost like online|first, real life second
405 00:15:39:07 00:15:40:29 because I’ve actually,
406 00:15:40:29 00:15:42:06 and I’ve witnessed this firsthand,
407 00:15:42:06 00:15:43:29 so I’ll go on a mild tangent.
408 00:15:43:29 00:15:47:19 But I’ve met people who|I’ve been friends with
409 00:15:47:19 00:15:52:02 or followed on Instagram|and Facebook, Twitter.
410 00:15:52:02 00:15:53:05 And then I’ve met them in real life
411 00:15:53:05 00:15:55:00 and they really don’t come across
412 00:15:55:00 00:15:58:05 anything like the way they are online
413 00:15:58:05 00:16:01:07 because online they’re kind|of hidden behind this wall of,
414 00:16:01:07 00:16:02:02 this protect,
415 00:16:02:02 00:16:04:16 they act and behave maybe in|the way they want to behave
416 00:16:04:16 00:16:07:15 but in the real world they’re|kind of almost very closed off
417 00:16:07:15 00:16:08:22 and very different.
418 00:16:08:22 00:16:10:11 And it’s kind of,
419 00:16:10:11 00:16:12:03 I wonder what will happen
420 00:16:12:03 00:16:14:11 with this kind of digital virtual world.
421 00:16:14:11 00:16:17:16 But going back|to employment and AI,
422 00:16:17:16 00:16:18:23 I think that that data
423 00:16:18:23 00:16:20:14 will become increasingly more important
424 00:16:20:14 00:16:21:16 because other than that,
425 00:16:21:16 00:16:23:12 there’s only so much you can|analyze from a job description.
426 00:16:23:12 00:16:26:21 There’s only so much you|can analyze from a resume.
427 00:16:26:21 00:16:30:10 Video and the content behind video,
428 00:16:30:10 00:16:33:17 we would be able to use sentiment analysis
429 00:16:33:17 00:16:36:07 but based on facial recognition.
430 00:16:37:05 00:16:38:20 Will people allow that?
431 00:16:38:20 00:16:41:03 Will they allow us to know whether or not
432 00:16:41:03 00:16:42:20 they looked to, whatever it is,
433 00:16:42:20 00:16:43:19 the left or the right
434 00:16:43:19 00:16:46:04 and they lied (laughs) about something.
435 00:16:46:04 00:16:47:06 All of this stuff,
436 00:16:47:06 00:16:48:08 I think in the government,
437 00:16:48:08 00:16:49:17 there’ll probably be opportunities
438 00:16:49:17 00:16:51:22 where facial recognition|interviews and video
439 00:16:51:22 00:16:52:27 will be,|- They already do it.
440 00:16:52:27 00:16:53:22 – Yeah, exactly.
441 00:16:53:22 00:16:55:17 They’ll be all over it.
442 00:16:55:17 00:16:58:17 And they’ll analyze you|on all of that stuff
443 00:16:58:17 00:17:02:28 to decide whether or not|you’re the right employee.
444 00:17:02:28 00:17:04:25 – So, you and Steve,
445 00:17:04:25 00:17:07:00 one of my good friends,|known him for years,
446 00:17:07:00 00:17:08:25 Steve O’Brien, wrote an article
447 00:17:08:25 00:17:10:01 and it got picked up by,
448 00:17:10:01 00:17:12:07 I guess, Nasdaq and a couple other places
449 00:17:12:07 00:17:14:07 and it was really about|the future of work.
450 00:17:14:07 00:17:16:18 What was the kind of impetus
451 00:17:16:18 00:17:19:06 or reason behind writing that article?
452 00:17:19:06 00:17:21:12 – So, everyone from McKinsey to Deloitte
453 00:17:21:12 00:17:23:16 have written these big reports
454 00:17:23:16 00:17:25:19 about what the future of work looks like.
455 00:17:25:19 00:17:27:12 I’ll quote Deloitte’s statement,
456 00:17:27:12 00:17:31:11 which was children aged|five today, in 15 years,
457 00:17:31:11 00:17:34:08 2/3 of them will be doing|jobs that don’t exist.
458 00:17:34:08 00:17:37:23 And so, the future of work and skills gap,
459 00:17:38:28 00:17:40:08 AI, automation,
460 00:17:42:20 00:17:44:22 all of those things|are like doom and gloom
461 00:17:44:22 00:17:47:02 and what’s going to happen to all of us?
462 00:17:47:02 00:17:49:20 What happens to the skilled labor
463 00:17:49:20 00:17:51:23 or the labor that’s now no longer needed
464 00:17:51:23 00:17:54:22 because the machines taken over?
465 00:17:54:22 00:17:56:19 And also, what do we need to learn?
466 00:17:56:19 00:17:59:14 And so, Steve and I wanted to talk about,
467 00:17:59:14 00:18:01:04 actually wanted to end on a positive note,
468 00:18:01:04 00:18:02:11 so we actually have a sequel
469 00:18:02:11 00:18:04:06 coming to that particular article
470 00:18:04:06 00:18:05:10 that will be out soon.
471 00:18:05:10 00:18:06:05 – I was going to say,
472 00:18:06:05 00:18:08:03 because you kind of left me hanging there.
473 00:18:08:03 00:18:09:28 – On purpose.
474 00:18:09:28 00:18:12:16 It’s a three part piece.
475 00:18:12:16 00:18:13:15 – I’ve got to wait.
476 00:18:13:15 00:18:14:10 (Arran laughs)
477 00:18:14:10 00:18:15:05 – We’re working on the second piece.
478 00:18:15:05 00:18:16:25 But the topic is this.
479 00:18:16:25 00:18:17:25 There’s a couple of things.
480 00:18:17:25 00:18:19:03 What do we need to learn?
481 00:18:19:03 00:18:20:18 And what do we need to teach our children?
482 00:18:20:18 00:18:22:24 If we can’t predict what|the jobs are in the future,
483 00:18:22:24 00:18:24:03 what can we teach them?
484 00:18:24:03 00:18:25:16 And Steve and I have a theory
485 00:18:25:16 00:18:26:28 that one of the most important skills
486 00:18:26:28 00:18:28:10 you’re going to need to have in the future
487 00:18:28:10 00:18:30:15 is the ability to make a decision.
488 00:18:30:15 00:18:31:20 And I know that sounds kind of,
489 00:18:31:20 00:18:32:24 well, yeah, that’s,
490 00:18:32:24 00:18:34:23 no it’s not obvious.
491 00:18:34:23 00:18:37:24 Have the ability to|take information quickly
492 00:18:37:24 00:18:40:19 and based on gut and based|on a level of common sense,
493 00:18:40:19 00:18:43:20 can you make a decision|and can you act on it?
494 00:18:43:20 00:18:46:01 – That was one of the key findings
495 00:18:46:01 00:18:48:15 that I got out of that article
496 00:18:48:15 00:18:51:12 was if you look at an organization,
497 00:18:52:08 00:18:54:22 how many people are actually empowered
498 00:18:54:22 00:18:58:12 to make good decisions|for the organization?
499 00:19:00:00 00:19:01:13 I would say that there’s very few
500 00:19:01:13 00:19:03:13 and they’re usually at the top.
501 00:19:03:13 00:19:06:07 Embedding that decision making capability,
502 00:19:06:07 00:19:09:18 I think, is going to be|absolutely critical moving forward
503 00:19:09:18 00:19:14:18 and driving those decision|making capabilities and abilities
504 00:19:14:24 00:19:18:13 for your employees to be|able to do that properly.
505 00:19:18:13 00:19:20:13 Driving that down into the organization
506 00:19:20:13 00:19:22:10 at some of your lowest levels
507 00:19:22:10 00:19:23:11 is absolutely critical.
508 00:19:23:11 00:19:27:15 You can’t slap people’s|hands for making a decision,
509 00:19:27:15 00:19:29:17 especially if it’s the wrong decision.
510 00:19:29:17 00:19:30:28 People make wrong decisions all the time.
511 00:19:30:28 00:19:33:02 – And then it’s the ability|to learn from that decision
512 00:19:33:02 00:19:35:11 and also the ability to fail fast.
513 00:19:35:11 00:19:36:06 – Yes.
514 00:19:36:06 00:19:37:24 – Failure should never be seen as failure.
515 00:19:37:24 00:19:39:15 It should be a step towards success
516 00:19:39:15 00:19:41:18 as long as you fail quickly.
517 00:19:41:18 00:19:43:23 – You successfully found|a way not to do something.
518 00:19:43:23 00:19:47:25 – Yeah, we successfully found|a way to a bloody mistake.
519 00:19:47:25 00:19:49:17 And then you move on.
520 00:19:50:18 00:19:52:05 The other part is this,
521 00:19:52:05 00:19:55:27 the actual, the commodity|that is knowledge.
522 00:19:55:27 00:19:58:03 Once upon a time,
523 00:19:58:03 00:20:00:06 you’d have to go to a|very specialty school
524 00:20:00:06 00:20:02:19 to learn particular|things, particular skills,
525 00:20:02:19 00:20:04:06 data science, stuff like that.
526 00:20:04:06 00:20:05:16 Now, everything
527 00:20:05:16 00:20:07:21 you could possibly ever want to learn
528 00:20:07:21 00:20:10:04 is just a keyboard away.
529 00:20:10:04 00:20:13:29 And if you have the tenacity and drive
530 00:20:13:29 00:20:16:16 to go and just read and learn,
531 00:20:16:16 00:20:20:27 you could learn almost|anything for basically free
532 00:20:20:27 00:20:23:01 almost anywhere in the world.
533 00:20:23:01 00:20:26:12 And you’ve just got to|be bothered to do that.
534 00:20:26:12 00:20:29:00 And so, it’s this kind of ever,
535 00:20:29:00 00:20:32:01 are you an individual who is static?
536 00:20:32:01 00:20:33:08 Or are you dynamic?
537 00:20:33:08 00:20:34:21 And this is kind of maybe revealing more
538 00:20:34:21 00:20:36:20 of what we’re going to|do in the next sequel.
539 00:20:36:20 00:20:38:17 (Jeff laughs)
540 00:20:38:17 00:20:39:22 – Oo, you heard it here first.
541 00:20:39:22 00:20:41:20 – Yeah, heard it here first.
542 00:20:41:20 00:20:46:08 But your ability to go out|there and capture information.
543 00:20:46:08 00:20:50:03 And then I’ll go straight|into the final parts.
544 00:20:50:03 00:20:50:28 I won’t say all of it
545 00:20:50:28 00:20:55:26 but it’s this understanding|of when we say doom and gloom.
546 00:20:55:26 00:20:59:16 Do we look at doom and|gloom as a national problem
547 00:20:59:16 00:21:01:27 or a problem for the United States?
548 00:21:01:27 00:21:06:00 Or do we look at the global|impact of technology,
549 00:21:06:00 00:21:08:01 which has resulted in global poverty
550 00:21:08:01 00:21:11:19 almost halving in the last two decades.
551 00:21:11:19 00:21:13:09 Inequality has increased|in the United States
552 00:21:13:09 00:21:16:04 but global poverty has halved|in the last two decades,
553 00:21:16:04 00:21:19:29 which means it’s screwing|things up for some people
554 00:21:19:29 00:21:22:02 but for the lion’s share,
555 00:21:22:02 00:21:24:17 this whole future of work|and what everyone’s doing
556 00:21:24:17 00:21:27:09 is actually good for the whole.
557 00:21:27:09 00:21:28:22 And it’s like,
558 00:21:28:22 00:21:30:16 it could be very easy,
559 00:21:30:16 00:21:32:08 and especially if you’re|on the receiving end
560 00:21:32:08 00:21:33:23 of where it goes wrong,
561 00:21:33:23 00:21:35:08 it can be very easy to just be like,
562 00:21:35:08 00:21:37:14 well, screw the rest of the world,
563 00:21:37:14 00:21:38:09 what’s going to happen to America?
564 00:21:38:09 00:21:39:16 And that’s true.
565 00:21:39:16 00:21:40:19 I’m all pro America.
566 00:21:40:19 00:21:41:14 I live here now.
567 00:21:41:14 00:21:42:14 I’ve got two children,|almost two children,
568 00:21:42:14 00:21:44:03 been born here now.
569 00:21:44:03 00:21:45:14 But at the same time,
570 00:21:45:14 00:21:47:11 I think we have to try and look at
571 00:21:47:11 00:21:52:00 that there is a definite|landscape shift happening
572 00:21:52:00 00:21:54:29 because of tech, because|of the future of work.
573 00:21:54:29 00:21:58:01 Globally it’s good in the sense|that it’s helping people.
574 00:21:58:01 00:21:59:20 It’s removing them from poverty.
575 00:21:59:20 00:22:02:17 On a more micro scale,|nationally, it’s not
576 00:22:02:17 00:22:06:16 because it is affecting|natives and people here
577 00:22:06:16 00:22:07:17 and that there’s a bigger inequality
578 00:22:07:17 00:22:09:07 and more people are unemployed there
579 00:22:09:07 00:22:11:02 because technology’s taken their jobs.
580 00:22:11:02 00:22:13:12 And what are we going to do about it?
581 00:22:13:12 00:22:14:20 What are we going to do about it?
582 00:22:14:20 00:22:17:01 – One of the things that|came from the article
583 00:22:17:01 00:22:21:09 was that for most people,|the job that they do,
584 00:22:21:09 00:22:23:03 the career that they have,
585 00:22:23:03 00:22:27:10 is really, it’s part of|their overall personality.
586 00:22:27:10 00:22:29:07 It’s basically, who they are
587 00:22:29:07 00:22:30:11 and how they function.
588 00:22:30:11 00:22:31:06 They get up.
589 00:22:31:06 00:22:32:01 They go to work.
590 00:22:32:01 00:22:33:14 They do their job and then they come home.
591 00:22:33:14 00:22:34:21 They spend more time at work
592 00:22:34:21 00:22:37:09 than they ever do at home.
593 00:22:37:09 00:22:41:04 And as AI takes over some of|the more repetitive tasks,
594 00:22:41:04 00:22:43:17 some of the more complex calculations,
595 00:22:43:17 00:22:45:15 some of the high volume communications
596 00:22:45:15 00:22:49:12 that are just embedded|in our everyday job,
597 00:22:49:12 00:22:52:13 you’re not going to have those|responsibilities anymore.
598 00:22:52:13 00:22:55:11 It’s going to be largely taken over by AI,
599 00:22:55:11 00:22:57:16 especially in the recruiter space.
600 00:22:57:16 00:22:59:19 You probably won’t need to source
601 00:22:59:19 00:23:01:19 or send out emails to candidates
602 00:23:01:19 00:23:03:01 or check your social media.
603 00:23:03:01 00:23:06:26 It’s all just going to be|attached to a job somehow.
604 00:23:06:26 00:23:10:00 How do you think people|are going to be reacting to
605 00:23:10:00 00:23:12:10 their changing roles and responsibilities
606 00:23:12:10 00:23:14:10 and their self identity?
607 00:23:15:09 00:23:16:22 – So, one of the things,
608 00:23:16:22 00:23:18:21 and you’ve kind of mentioned it,
609 00:23:18:21 00:23:21:13 is working is one of|the most natural things
610 00:23:21:13 00:23:23:00 that we can do as human beings.
611 00:23:23:00 00:23:27:27 So, work is just as natural|as love, as friendship
612 00:23:27:27 00:23:30:03 because it gives us a purpose.
613 00:23:30:03 00:23:31:16 It gives us an identity.
614 00:23:31:16 00:23:33:18 As you said it’s one of|the reasons we get up.
615 00:23:33:18 00:23:34:22 It gives us routine.
616 00:23:34:22 00:23:36:10 It’s actually very important
617 00:23:36:10 00:23:38:01 and for thousands and thousands of years
618 00:23:38:01 00:23:41:06 for the history of human beings,
619 00:23:41:06 00:23:42:09 work has been responsible
620 00:23:42:09 00:23:45:05 for the majority part of our life.
621 00:23:45:05 00:23:48:18 So, I think there’s an element of,
622 00:23:48:18 00:23:49:29 and I actually spoke to Steve earlier
623 00:23:49:29 00:23:51:08 and he gave a great analogy.
624 00:23:51:08 00:23:55:01 He’s like, technology should|increase the amount of time
625 00:23:55:01 00:23:56:27 that we all have to have coffee together.
626 00:23:56:27 00:23:59:05 (Arran laughs)
627 00:23:59:05 00:24:00:27 And it was a great,
628 00:24:00:27 00:24:02:07 and I kind of paused and I thought,
629 00:24:02:07 00:24:03:02 what do you mean by that.
630 00:24:03:02 00:24:05:14 And he goes well, getting rid|of the mundane, boring tasks
631 00:24:05:14 00:24:08:08 means we’ve got more time to|spend with each other, right?
632 00:24:08:08 00:24:10:22 And he’s like, it’s true.
633 00:24:10:22 00:24:12:15 – I don’t think that’s going to happen.
634 00:24:12:15 00:24:13:11 (Arran laughs)|What I think is,
635 00:24:13:11 00:24:14:19 – No, of course it won’t.
636 00:24:14:19 00:24:16:05 That was idealistic
637 00:24:16:05 00:24:18:00 but we’ll all have coffee together.
638 00:24:18:00 00:24:20:05 But I think his sentiment,
639 00:24:20:05 00:24:21:24 what he meant by that was
640 00:24:21:24 00:24:25:09 it could alleviate a lot|of the mundane tasks.
641 00:24:25:09 00:24:28:14 However, we have to look|at the remuneration piece
642 00:24:28:14 00:24:32:13 because remuneration is|based on hours served
643 00:24:32:13 00:24:34:11 rather than tasks completed.
644 00:24:34:11 00:24:36:12 Well, depends where you are.
645 00:24:36:12 00:24:37:11 – Right.
646 00:24:37:11 00:24:40:21 So, you’re either getting paid for your,
647 00:24:40:21 00:24:41:16 the work that you’re doing
648 00:24:41:16 00:24:43:09 or the hour that you spent doing it
649 00:24:43:09 00:24:46:19 or you’re getting paid for your|knowledge and your intellect
650 00:24:46:19 00:24:49:11 and your thought process.
651 00:24:49:11 00:24:51:13 – So again, this is where|certain people come unstuck
652 00:24:51:13 00:24:54:21 and it’s normally the people|that get destroyed by AI
653 00:24:54:21 00:24:55:16 and these are the,
654 00:24:55:16 00:24:57:13 the ones that get paid for their time
655 00:24:57:13 00:24:58:25 that AI can replace,
656 00:24:58:25 00:25:00:26 are the ones that will suffer
657 00:25:00:26 00:25:02:08 because it’s like, okay, well,
658 00:25:02:08 00:25:04:00 we pay you for your hourly work.
659 00:25:04:00 00:25:05:17 You only did two hours instead of 10 hours
660 00:25:05:17 00:25:08:05 because our machine|here did it all for you.
661 00:25:08:05 00:25:09:25 So, we’re only going|to give you two hours.
662 00:25:09:25 00:25:11:26 And it’s like, great for commerce
663 00:25:11:26 00:25:14:27 because now I’ve just saved|eight hours worth of labor
664 00:25:14:27 00:25:16:03 but terrible for the individual
665 00:25:16:03 00:25:18:06 who has to earn a salary|in order to pay their bills
666 00:25:18:06 00:25:19:20 and feed their families.
667 00:25:19:20 00:25:22:23 And that’s the bit where there|is also this kind of talk
668 00:25:22:23 00:25:24:21 within our art schools,
669 00:25:25:18 00:25:28:21 which is what will the state do?
670 00:25:28:21 00:25:31:21 What will the government do in the end?
671 00:25:31:21 00:25:34:09 We don’t want to live in a,
672 00:25:34:09 00:25:35:05 well, you do and you don’t.
673 00:25:35:05 00:25:36:18 You don’t want to live in a world
674 00:25:36:18 00:25:39:21 where you’re kind of|just living by the state
675 00:25:39:21 00:25:41:24 and it’s kind of handing|out your living to you
676 00:25:41:24 00:25:44:07 because the robots have taken over.
677 00:25:44:07 00:25:46:10 But at the same time,
678 00:25:46:10 00:25:49:10 there has to be some level|of social responsibility
679 00:25:49:10 00:25:52:13 where we look after|people in a certain way
680 00:25:52:13 00:25:54:05 that they have the ability to,
681 00:25:54:05 00:25:56:27 they have a right to live.
682 00:25:56:27 00:25:58:22 And so, how are we going to combat that
683 00:25:58:22 00:25:59:27 and are we good?
684 00:25:59:27 00:26:02:17 And I think the thing with the|industrial revolution,
685 00:26:02:17 00:26:03:18 the fourth industrial revolution,
686 00:26:03:18 00:26:04:29 which is what we’re|going through right now,
687 00:26:04:29 00:26:07:17 according to Deloitte and McKinsey
688 00:26:07:17 00:26:08:17 and all of them, in fact,
689 00:26:08:17 00:26:11:26 is that there’s going to|be some significant changes
690 00:26:11:26 00:26:12:21 that are going to come
691 00:26:12:21 00:26:13:28 but one of my other concerns,
692 00:26:13:28 00:26:14:27 and so is Steve’s,
693 00:26:14:27 00:26:17:07 is are governments going to react
694 00:26:17:07 00:26:19:12 fast enough to that change?
695 00:26:19:12 00:26:20:20 And I don’t think they are.
696 00:26:20:20 00:26:22:11 That’s the one thing that worries me.
697 00:26:22:11 00:26:25:26 – So, one of the stats|that was in the article
698 00:26:25:26 00:26:28:09 was that 375 million workers,
699 00:26:30:24 00:26:32:15 across every industry,
700 00:26:32:15 00:26:33:10 will have,
701 00:26:34:11 00:26:35:12 either be displaced
702 00:26:35:12 00:26:39:02 or have their job|functionality, responsibility
703 00:26:39:02 00:26:42:20 significantly changed|in the next 10 years.
704 00:26:44:16 00:26:45:22 That’s massive, right?
705 00:26:45:22 00:26:46:17 – Yeah.
706 00:26:46:17 00:26:49:04 – Is it going to be a|slow, gradual and then,
707 00:26:49:04 00:26:50:27 or is it going to be|kind of slow and gradual
708 00:26:50:27 00:26:52:02 and then a hockey puck?
709 00:26:52:02 00:26:53:06 How’s this going to,
710 00:26:53:06 00:26:55:12 – I think it’s going to hockey stick.
711 00:26:55:12 00:26:57:06 And it’s like anything.
712 00:26:57:06 00:27:02:00 Like any good growth curve|of a successful business
713 00:27:02:00 00:27:04:03 or a successful takeover,
714 00:27:05:02 00:27:06:26 like oh, we’re a company,
715 00:27:06:26 00:27:08:25 we can now use machines|to do this all for us.
716 00:27:08:25 00:27:09:21 Let’s implement it.
717 00:27:09:21 00:27:10:19 Overnight suddenly.
718 00:27:10:19 00:27:12:21 They ax half the staff,
719 00:27:12:21 00:27:15:01 change half their roles or whatever.
720 00:27:15:01 00:27:16:08 I think it will hockey stick.
721 00:27:16:08 00:27:17:12 And I think this is again,
722 00:27:17:12 00:27:19:07 the responsibility of,
723 00:27:20:07 00:27:22:09 there’s a level of|responsibility for companies
724 00:27:22:09 00:27:25:10 because don’t lose sight of the fact,
725 00:27:25:10 00:27:27:10 as a company whatever|you’re selling or doing
726 00:27:27:10 00:27:28:25 needs consumers.
727 00:27:28:25 00:27:30:28 The robots will not|consume what you produce
728 00:27:30:28 00:27:32:13 or you give or your service.
729 00:27:32:13 00:27:34:29 So, if you have no one to|consume what you offer,
730 00:27:34:29 00:27:36:28 then the robots don’t work anyway.
731 00:27:36:28 00:27:40:01 So, it’s got to be that sort of conscious,
732 00:27:40:01 00:27:40:25 in the back of the mind,
733 00:27:40:25 00:27:41:29 oh, it saves a fortune
734 00:27:41:29 00:27:43:07 it’s like but yeah, now|you have no customers
735 00:27:43:07 00:27:46:10 because no one’s got any|money to buy anything.
736 00:27:46:10 00:27:47:17 And then there’s also this kind of
737 00:27:47:17 00:27:48:28 conscious awareness from governments
738 00:27:48:28 00:27:52:20 that this fantastic|opportunity of technology
739 00:27:54:00 00:27:56:23 alleviating a lot of the|hard effort and labor
740 00:27:56:23 00:27:58:08 that we’ve had to do,
741 00:27:58:08 00:28:02:05 just like I’m no longer|walking three miles
742 00:28:02:05 00:28:04:23 to my local HEB to go|and get my groceries.
743 00:28:04:23 00:28:07:02 I jump in a car and it does it for me.
744 00:28:07:02 00:28:09:21 It’s like and I just sit behind the wheel.
745 00:28:09:21 00:28:10:16 It’s the same thing.
746 00:28:10:16 00:28:13:26 We just have to make sure that|we deliver this technology
747 00:28:13:26 00:28:15:19 in a responsible way.
748 00:28:17:15 00:28:20:02 Now, will that happen?
749 00:28:20:02 00:28:21:26 I think greed will, (laughs)
750 00:28:21:26 00:28:24:14 – I think that money and greed
751 00:28:24:14 00:28:28:15 and the quest for either|recruiter efficiency
752 00:28:28:15 00:28:32:00 or just overall corporate efficiency,
753 00:28:32:00 00:28:35:11 it could take a big toll|in the next 10 years.
754 00:28:35:11 00:28:37:15 – It might be some epic,
755 00:28:37:15 00:28:41:20 it might be even responsible|for another epic downturn.
756 00:28:41:20 00:28:43:19 And I know how that sounds.
757 00:28:43:19 00:28:45:14 – There’s already some|economic predictions
758 00:28:45:14 00:28:46:29 that by 2028, 2030
759 00:28:48:17 00:28:50:28 there’s going to be a major|economic downturn anyway
760 00:28:50:28 00:28:51:28 that’s going to look,
761 00:28:51:28 00:28:53:14 make the last two depressions
762 00:28:53:14 00:28:55:04 look like cakewalks.
763 00:28:55:04 00:28:56:29 – Yeah, like a walk in the park.
764 00:28:56:29 00:28:59:29 And I kind of believe that.
765 00:28:59:29 00:29:02:08 I think that that might happen.
766 00:29:02:08 00:29:03:14 I think,
767 00:29:03:14 00:29:07:24 I would hope that we would|have the vision and foresight
768 00:29:07:24 00:29:10:22 to know that based on the|last economic downturns
769 00:29:10:22 00:29:13:06 that we might be a little|bit more controlling.
770 00:29:13:06 00:29:16:06 This is the problem as well.
771 00:29:16:06 00:29:17:03 When they say,
772 00:29:17:03 00:29:20:21 government intervention is|classed as economic failure
773 00:29:20:21 00:29:22:16 as one of the points in economics.
774 00:29:22:16 00:29:25:20 Free markets are meant|to be able to be free.
775 00:29:25:20 00:29:27:26 I do believe in free markets, obviously.
776 00:29:27:26 00:29:30:27 But I also believe that|we’ve dawned on a point
777 00:29:30:27 00:29:33:15 of technology and progression.
778 00:29:33:15 00:29:38:02 The economic principles|that were scratched out
779 00:29:38:02 00:29:41:27 during pin trading and stuff like that
780 00:29:41:27 00:29:43:23 are very different now.
781 00:29:43:23 00:29:45:03 So, I think there does need
782 00:29:45:03 00:29:46:10 to be a decent level of intervention
783 00:29:46:10 00:29:48:12 to make sure we’re all okay.
784 00:29:48:12 00:29:51:05 Because otherwise, if we|just let it go free market
785 00:29:51:05 00:29:55:10 and the natural disparity|between inequality,
786 00:29:55:10 00:29:57:02 AI and machines taking over
787 00:29:57:02 00:29:58:18 and just suddenly this|tiny group of people
788 00:29:58:18 00:30:01:05 have everything and|everybody else has nothing,
789 00:30:01:05 00:30:03:08 then we’ve kind of gone back|to a monarchy, haven’t we?
790 00:30:03:08 00:30:05:20 (laughs)
791 00:30:05:20 00:30:07:21 – Yeah, it should be interesting to see.
792 00:30:07:21 00:30:11:05 I think there’s going to be some potential
793 00:30:11:05 00:30:12:19 social uprising about that.
794 00:30:12:19 00:30:15:00 And we’re seeing that in|some countries already.
795 00:30:15:00 00:30:16:20 And it’s not necessary due to AI
796 00:30:16:20 00:30:17:25 but just to
797 00:30:17:25 00:30:18:24 – General things.
798 00:30:18:24 00:30:23:03 – The disparity in|either financial freedom
799 00:30:23:03 00:30:24:22 or political freedoms
800 00:30:24:22 00:30:26:24 or just freedom of speech, whatever.
801 00:30:26:24 00:30:28:29 We’re certainly seeing that all over.
802 00:30:28:29 00:30:31:25 All right so, I’ve got|a little section here
803 00:30:31:25 00:30:34:18 that I like to call Nope and Yup.
804 00:30:36:12 00:30:38:07 You’re only allowed to answer the question
805 00:30:38:07 00:30:40:19 with either a nope or a yup.
806 00:30:40:19 00:30:43:12 No elaboration on the answer at all.
807 00:30:43:12 00:30:45:01 And we can discuss it further if you want
808 00:30:45:01 00:30:47:17 but not during this section, okay?
809 00:30:47:17 00:30:48:19 – Okay, fine.
810 00:30:48:19 00:30:50:12 – So, you’re only allowed|to answer with a yope,
811 00:30:50:12 00:30:51:15 a yope (laughs)
812 00:30:51:15 00:30:52:10 a nope or a yup.
813 00:30:52:10 00:30:53:21 – Nope or yup.
814 00:30:53:21 00:30:54:16 – Yes.
815 00:30:54:16 00:30:57:27 All right so, AI is still in its infancy
816 00:30:57:27 00:31:00:14 and basically at a kindergarten level.
817 00:31:00:14 00:31:02:03 – Yup.
818 00:31:02:03 00:31:04:11 – Once a task is learned by AI,
819 00:31:04:11 00:31:08:08 it can’t be unlearned|without rewriting the code.
820 00:31:08:08 00:31:09:03 – Nope.
821 00:31:10:11 00:31:12:12 – Some recruiters will lose their jobs
822 00:31:12:12 00:31:15:03 because of advancements in technology.
823 00:31:15:03 00:31:15:28 – Yup.
824 00:31:16:27 00:31:19:21 – The recruiting process|is massively inefficient
825 00:31:19:21 00:31:21:06 as it stands today.
826 00:31:24:04 00:31:24:29 – Yup.
827 00:31:26:05 00:31:28:17 – There are few credible use cases
828 00:31:28:17 00:31:31:16 for using blockchain in|the recruiting process.
829 00:31:31:16 00:31:32:21 – Nope.
830 00:31:32:21 00:31:33:16 – All right.
831 00:31:33:16 00:31:34:14 (both laugh)
832 00:31:34:14 00:31:36:08 Arran Stewart, thank you very much.
833 00:31:36:08 00:31:38:00 – That was a rhetorical|question, that last one.
834 00:31:38:00 00:31:40:01 (both laugh)
835 00:31:40:01 00:31:42:19 – I had to throw you a|softball in there, right?
836 00:31:42:19 00:31:43:27 (Arran laughs)
837 00:31:43:27 00:31:45:02 – Yeah, yeah.
838 00:31:45:02 00:31:47:05 – So, we’ve got a little|thing called After Hours.
839 00:31:47:05 00:31:48:06 Don’t hang up
840 00:31:48:06 00:31:50:00 but I’m going to end this portion of it
841 00:31:50:00 00:31:52:15 and then we’re going to|pick up in the After Hours.
842 00:31:52:15 00:31:54:01 Everybody, thank you so much.
843 00:31:54:01 00:31:55:06 Arran, thank you for joining me today.
844 00:31:55:06 00:31:56:01 – Thank you so much.
845 00:31:56:01 00:31:56:26 Thanks, Jeff.
846 00:31:56:26 00:31:57:21 Real pleasure to be on.
847 00:31:57:21 00:31:58:15 It’s been fantastic.
848 00:31:58:15 00:32:00:00 – All right, everybody,
849 00:32:00:00 00:32:01:13 check out the After Hours
850 00:32:01:13 00:32:03:16 where we’re going to|finish up the conversation
851 00:32:03:16 00:32:05:01 with Arran here.
852 00:32:05:01 00:32:05:26 Thanks everybody.
853 00:32:05:26 00:32:06:21 Make it a great one.|(rock music)
854 00:32:06:21 00:32:07:16 Bye-bye.
END