Brendan Cox 00:23
Data is a massive thing now and with eLearning moving in the direction it is going in, knowing why you’re doing something and being able to check if it’s working right is more important than ever. So we thought we’d speak to a data specialist.
Today on the podcast, we’re going to chat to Anja Hartleb-Parson and find out about what she does, her background, and the way she sees the industry going. So, hi Anja.
Anja Hartleb-Parson 01:04
Brendan Cox 01:05
How are you doing today?
Anja Hartleb-Parson 01:06
Brendan Cox 01:07
Good. So you’re over in the US at the moment?
Anja Hartleb-Parson 01:11
I’m in the Midwest. Close to Chicago. It’s very cold and snowy.
Brendan Cox 01:22
Everyone’s got problems with the weather at the moment. It’s been crazy.
So tell me a bit about yourself. How would you describe yourself as a data specialist?
Anja Hartleb-Parson 01:39
So I guess I’ll back up a little bit. I actually did not start in data, per se, until maybe a few years ago. I originally graduated with degrees in philosophy, psychology, political science and organisational behaviour.
So, that’s a lot of education, I know. But what I was doing originally was actually building startups, startup nonprofits to be precise. So I worked in a lot of different areas, which is sort of the nature of startups in general; you just have to have your hand in everything, because there are just never enough resources.
You have to wear a lot of hats. Exactly.
The nice thing, apart from this being very suitable to my generally diverse brain and skill set, is that you just learn a lot about different areas in business or in organisations. And what always intrigued me is the people piece, the people development piece.
That’s not just human resources, per se, but the training and development piece. I did a lot of training in my previous roles. And I do that right now as well.
Last year I wanted to start looking back into training roles, into learning and development roles, because I was really maxing out my potential in what I’m doing right now, which I’ll explain in a minute. But the learning and development field in and of itself, as a community, was fairly new to me.
Even though I had done a lot of training in my other jobs, and as I was completing graduate education in data science, I poked a little bit into the data aspect of L&D and realised that I’d poked the bear there. It seems that data in particular is part of the challenge for learning and development.
I thought to myself, well, I’ve been really fascinated by data for a while. So that’s the pivot I’ve been trying to make professionally, out of what I’m doing right now, which is mostly professional services in the financial realm. So that’s the short background.
Brendan Cox 05:12
Okay, with what you do in the financial realm, it’s a lot more data-driven at this point. There are a lot more established roles for data analysis in that?
Anja Hartleb-Parson 05:28
So in the area where I am, which is mostly sales support and programme support, the data is focused primarily on sales efforts and marketing efforts, and then, of course, lead generation, etc. So those areas, in general, are fairly well established when it comes to data, for obvious reasons.
But the training area, which I also do (I train financial advisors, and they in turn train banking staff) really doesn’t have very good data, on the flip side. I would run into these issues of training people and not really knowing whether what I’m doing has really taken hold, whether the methods I’m using are working, or what the impact really is. You can see it a little bit in terms of: is this advisor getting up to speed and generating new business? But it’s a grey area.
Brendan Cox 06:58
They say ‘great course, thanks for that, that was really entertaining’.
Then you don’t get to find out how much difference is tangibly being made. The funny thing is, what you’ve done might have made a massive difference, but the proof is not there. So it’s always a bit of a guess as to how much impact you’ve had at the end of the day.
Anja Hartleb-Parson 07:23
It’s there. But it’s also the fact that training is seen as something that must be done. It’s tangential; it’s not really seen as a strategic piece. So when it comes to wanting to improve data collection and data analysis, that’s not where the resources go.
Brendan Cox 07:50
Okay, we noticed that a lot from our side when we were exploring things like compliance. More often than not, people are just trying to mitigate risk. So what they want to do is the bare minimum to make sure they don’t get in trouble.
They’re not really going to want to spend more money than they have to; they just want to get the job done in the cheapest way possible. Whereas with some other types of eLearning everyone’s a bit more interested in making something creative out of it, with the compliance stuff it’s always ‘what’s the least we can do on this to get us past the post?’ So when it comes to spending more time analysing the data, I can imagine they just say, ‘no, we don’t need that.’
Anja Hartleb-Parson 08:39
That is very true; the compliance piece is massive in the financial industry. A lot of the course learning that takes place, you might say, is centred around compliance, and a lot of it is extremely tedious and very poorly designed. When I ask the question, well, what impact is this actually having on compliance? People shrug their shoulders and say we just check completion rates, because that’s basically what we need to report.
Brendan Cox 09:21
Was the subject matter itself one of the reasons you wanted to change, or was it more like a new pond with not as many fish in it?
Anja Hartleb-Parson 09:36
Well, both of those things. People development has always been a big part of my work and a professional passion for me. And combining that with data, which is also something I’m extremely passionate about, just from an evidence point of view.
In other words, do we actually have evidence for what we’re doing? So that’s always been big with me. Figuring out that L&D is struggling with data was a fortuitous event; I didn’t realise that before.
We spend a lot of money on training; we should have good data on whether that’s a good return on investment. So I was a little surprised, speaking to a lot of people with way more experience in the L&D space than I have, to hear them complaining about this, and also getting really excited that someone comes in and says, hey, can I help you out with data?
Brendan Cox 10:55
Totally. From my background in animation, there’s the same sort of thing. We make a tonne of content and it just gets sort of burped out onto the internet and disappears. It’s amazing how much money is spent on content that has absolutely no metrics attached to it. You spent all this money; what was the point?
I think that was one of the things we found about eLearning: the learning and training sector has so much room to grow, and so much enthusiasm for embracing it, a real let’s-get-cracking attitude, especially with the changes from COVID.
Working remotely and all this stuff, maybe it’s going to be a bit of a renaissance on the business side of things. So better quality design, better analytics, all this stuff that’s traditionally in the business sector is now coming into learning. There is lots of potential.
Anja Hartleb-Parson 11:58
There is definitely a lot of potential. But I do see the movement being fairly slow. So someone like me, who goes on the job boards and looks for data analyst roles in L&D, is not coming up with a whole lot.
Organisations recognise that they need to expand that capability, but they’re not really putting resources behind it quite yet. Or they do the thing where they say: we want it, so we’re going to hire instructional designers and learning and development specialists who are also really good at data.
I’m not sure that’s going that well. For one thing, data in itself is usually a full-time job anyhow. And a lot of people go into L&D because they want to create learning; most of the time they’re not wanting to know whether their learning actually has an impact, beyond the usual ‘oh, I enjoyed this course’ evaluation.
But getting good at data analysis and figuring out a good way to approach and analyse data is a skill set in and of itself that requires experience.
Brendan Cox 13:48
Especially with things like instructional design, it’s quite a fluffy term that could include almost anything, to the point where it’s ‘oh, and a bit of data, and a bit of this, oh, and a bit of development, can you put some augmented reality and VR in there as well?’
That mistake of, as you said, just tacking it on as a little side skill is not really making the most of it. In the beginning, you have a goal you want to achieve, and then you do the whole project and never come back to work out if the goal was achieved.
Anja Hartleb-Parson 14:48
Maybe if you want to ask some more of your questions, that’d be fine. But I do have my own thoughts on that.
Brendan Cox 15:02
Go ahead. So how do you feel about it all?
Anja Hartleb-Parson 15:16
So I don’t want to step on anybody’s toes. But the first problem I see from the outside is a fixation on tools, and I find that’s not uncommon in most business domains generally. Even if you’re looking at instructional design or L&D specialist job descriptions: you must be proficient in X, Y and Z authoring tools, and this LMS or LXP, as they’re now called.
It’s the latest fashion when it comes to learning design. And that focus in and of itself is a bit problematic to me, because the tool is just the tool. The question is: what do you want to achieve with it, and are you achieving that? That’s really where you need good data. And speaking of LMSs in particular, there are loads of vendors out there, and of course they’re marketing themselves and talking a good game.
But as an L&D department or a training department, if you don’t have a clear data strategy, you’re just looking at it as a shiny object: oh, look, they say they can track this, that and the other thing. The question is whether that’s useful for you, given that you want to actually create an impact on business goals, on KPIs, etc., with your training.
So I guess it’s nice to know that your trainees are watching videos 95 to 100% of the way through, but that doesn’t tell you anything about whether they’ve learned. And so, I’m going to be blunt, this tool fetish is a little disconcerting to me, but it’s something a lot of business domains struggle with.
The other piece, and I think this might be the more fundamental question, is how L&D and training are perceived in the organisation itself. I find, at least from what people are telling me, that a lot of the time it’s still viewed as a cost centre, as opposed to an investment centre.
So leadership is looking at L&D as a cost, and they want to contain costs. If you’re looking to get more resources, that is your first barrier, so to speak: convincing people that this is a good thing to spend money on. But then you’re also not reporting the things that might influence leaders to look at what you’re doing as more of an investment.
So if you’re not really able to report impact on business goals, or on KPIs, the key metrics the business is interested in, your seat at the table is going to be a lot smaller. So a lot of this follows from perspectives within the business on L&D, and on what role data should play in particular.
Brendan Cox 19:19
What stage would you traditionally start being involved in a project? Do you get started as early in the project as you think you should?
Anja Hartleb-Parson 19:41
That’s a really good question. As I said, I’m trying to pivot into these roles; right now I’m doing a lot of free advising. I find that usually the strategy for data collection isn’t established early enough. So when you start with a learning project and you’re looking at, okay, what is actually the problem we’re trying to solve, or the need we’re trying to address, you’re sometimes already falling short on data there.
So it’s like some manager from a department comes to you and says, well, we need a course on this. And then you start poking a little bit and ask: why do you need this course? What do they think? Well, our people are not strong on this particular piece. It might be a soft skill, for instance, or it might be something harder skill-related; in sales, they’re not hitting their numbers, for instance, or they’re not using the new CRM system efficiently, or whatever it is.
And so then you ask: well, okay, how are you seeing these things? What makes you think these are the problems? You get opinions, not hard data. So the starting point is already problematic.
Brendan Cox 21:40
You are already going off course before you even start.
Anja Hartleb-Parson 21:44
As a lot of seasoned instructional designers know, the problem you’re being asked to solve isn’t always the problem that actually needs to be solved.
If you don’t really have good data, getting to the actual problem that needs to be solved is going to be a lot harder. Sometimes the issue is that the department isn’t really good at tracking data, or the right data anyway, or people have preconceived notions and their own assumptions about what the problem is, and that’s what they tell you. So the starting point is often an issue already.
Brendan Cox 22:45
So do you find that soft skills play quite a role in getting them to bring in data at the right time during analysis, as well as the soft skills involved in extracting that data and putting it across? Would you say there’s an overlap between analysis and soft skills? Because you’ve got the psychology of the people you’re trying to analyse, but then the psychology of the people you’re analysing it for. You’re educating the client about the importance of data at the same time as doing the analysis itself, if you know what I mean. I was wondering, how do you see this side of it, the psychology part?
Anja Hartleb-Parson 23:51
So I try my best not to speculate too much about what’s going on in people’s heads, though you can’t really escape that sometimes. I do find, and this is just my experience, that a lot of people look at training or learning as a goal in and of itself. I see this more in higher education than in business. There is an intrinsic value to learning, and I’m not disputing that, but when you approach it from the perspective that learning in and of itself is good, it doesn’t necessarily need to be justified with data.
That generally leads to activities that aren’t measured in terms of return on investment, or aren’t measured well enough, and at the very least to inefficient approaches to learning. Sometimes you need to uncover that mindset and combat it a little bit. It’s easier to do that in business, because businesses are generally concerned about the bottom line and the cost of things.
I’m not saying that institutions of higher learning are not, but things are a little bit slower in higher education and government.
You also have the fact that you’ve got, at least from a perception point of view, somewhat unlimited coffers, because we can always tap the taxpayer. That’s not the greatest basis for developing effective learning and training; that’s my opinion. I think the thing is that education is full of people who are in it because they want to help people and share knowledge.
Sometimes they focus purely on that, and they miss setting goals properly. Without proper goals, it’s difficult to get data. We feel like we’re doing a good thing, so we don’t measure it as much.
Brendan Cox 26:00
Would you apply similar data analysis to softer skills training as opposed to something like sales training?
Anja Hartleb-Parson 27:13
That’s a really good question, and it is definitely a big topic, because soft skills are much harder to measure, in many cases anyway. But the approach I find more helpful there is to not let the perfect be the enemy of the good. I’ll just give an example.
So let’s just say we’re looking at our employees’ experience and job satisfaction, and we recognise that leadership, the role the department managers play, is pretty important. We have pretty good research from many decades on what leadership skills are generally effective. So we do surveys of our employees on their job satisfaction, or we go more specifically into whether they think their department leaders are good leaders, and we take that as a baseline.
If we’re seeing results that are not very encouraging, where there’s definitely room for improvement (in other words, job satisfaction is fairly low, or too low anyway), then we develop training based on the leadership skills we think are lacking. This is where you establish your baseline: through surveys, you look at how employees are assessing these particular skills in their leaders.
And of course, yes, this isn’t pure objectivity; it is certainly subjective. But nevertheless, you have the baseline. Then you institute training, and you can perform the same measurements again and see if the needle has moved.
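As a rough sketch of that baseline-and-remeasure idea, the comparison could look something like the following. The survey scale, the ratings, and the "meaningful shift" threshold here are all hypothetical illustrations, not anything from the conversation:

```python
from statistics import mean

def needle_moved(baseline, followup, min_shift=0.3):
    """Compare average survey ratings before and after training.

    baseline / followup: lists of 1-5 ratings (a hypothetical scale).
    min_shift: the smallest average improvement treated as meaningful;
    this is a judgement call, not a statistical test.
    """
    delta = mean(followup) - mean(baseline)
    return delta, delta >= min_shift

# Hypothetical ratings for "my manager gives useful feedback",
# collected before and after a leadership course.
before = [2, 3, 2, 3, 2, 4, 3, 2]
after = [3, 4, 3, 3, 4, 4, 3, 3]

delta, moved = needle_moved(before, after)
print(f"average shift: {delta:+.2f}, needle moved: {moved}")
```

In practice you would want a properly designed instrument and some notion of statistical confidence; the point is only that a baseline plus a repeat measurement is enough to start seeing whether the needle moves.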
While qualitative assessments like surveys are certainly more subjective, and it’s not the same as quantitative data, qualitative data is still good for this, I think, provided the instruments you’re using are very well developed.
We all know the pitfalls of surveys and survey design. And so that’s where you need good data people, let’s just call them instrumentalists: people who have experience developing qualitative instruments. Then you can see whether training seems to be moving the needle on this particular aspect, or not moving it.
Granted, you can’t attribute 100% of that movement to training, because there are always other factors. But that’s not what you’re trying to do anyway; it wouldn’t really be a good use of your time. So what you do when you communicate your results is hedge and say: look, this is what we’re seeing, there is movement, and we think that training has definitely played a role in it. That way you can justify it, or at least you can tell whether you’ve designed good training or not.
So that’s the way I would approach soft skills in general. You’re measuring whether you’re moving the needle; there isn’t a precise dial of numbers around it, but at least you’re getting a sense of whether you’re moving in the right direction.
I think it’s also good to remember that’s often the case with hard skills too. Sales training is not going to account for 100% of the effect on the sales numbers either; there are always other factors contributing to the numbers you get as a salesperson, and any salesperson can tell you that. It’s not just the training that determines whether they meet their goals. So even when we’re measuring these hard things, we always need to be mindful that training isn’t the only variable.
Brendan Cox 32:47
So everyone’s starting to realise data plays a bit more of a role in the learning process. What are your predictions for the future? You said it’s moving, but moving slowly. What are you seeing?
Anja Hartleb-Parson 33:06
So what I’m seeing right now is something I’ve been monitoring a little bit for the last decade or so. When you look at human resources in particular, human resources is getting better on the data piece. It’s not where sales and marketing are, or supply chain, or these big, big business areas, but there is definitely movement in the right direction.
Human resources in particular is ahead of training and development, I would say. It depends on the organisation: sometimes training is really integrated into human resources, sometimes it’s not as integrated, and there may be data sharing issues there some of the time. Once that improves, I think we’ll be looking at human resources as more of a strategic capability and an investment.
And again, that mindset has definitely changed over the last decade or so, from what I’ve seen. If we apply that same reasoning to training and development in particular, the movement will hopefully be faster.
With a lot of the technology being developed right now, I do fear a little bit that we’re applying bulldozers to things where we haven’t even managed to shovel a hole. So we need to be careful with that.
Everyone is looking at AI as the next thing. As someone who’s trained in machine learning algorithms and AI, I can tell you it’s not a panacea. There are definitely a lot of things that need to be carefully worked out before you can more generally apply AI to various data issues. But technology in and of itself is moving pretty quickly, as it always is.
I hope that is something L&D will further take advantage of, to really develop the data piece in and of itself. But it all depends on whether L&D in general says: we actually need data analysts on our team, not just that person we borrowed from two doors down who has two hours to devote once a month, because they also have a regular job.
Brendan Cox 36:39
So I guess that’s the main challenge now: educating the educators about the importance of investing in data early?
Anja Hartleb-Parson 36:50
It’s like what we’ve been doing in human resources, hiring people on the data side in HR. Just looking at the job market, I see much more movement there. So if you wanted to get into data in the people development space, there are certainly many more opportunities in HR than in training in particular. That’s what I’ve learned anyway: take one step into it, and then possibly pivot from there again. Which is a little bit frustrating for people like me, who are very impatient and just want to solve the problems they want to solve.
Brendan Cox 37:45
I totally empathise. Tom and I are the same. We’ve basically pivoted multiple times in the last year, and it’s just a massive landscape for learning. We’re realising that maybe compliance isn’t the place for us; maybe, actually, storytelling needs to be more of a thing. And then we’re finding out that analysis at the beginning is actually what we’re good at. So I totally get it. We’re super impatient as well. It’s that thing of wanting to find your perfect sweet spot in the learning community.
Anja Hartleb-Parson 38:19
So there are certainly people, recognised voices in the community I would say, who’ve been pounding the data drum for quite a while. A lot of them, I see, are actually independent consultants, so they work with clients; I don’t see too many people who have a full-time data job within an organisation. But the folks I’ve talked to seem, a lot of the time, to have gotten into it and basically had to start from scratch. That’s what happens when you’re trying to get a new capability launched, or made more visible, within an organisation.
I’ve also talked to people who are trained, seasoned instructional designers, or learning developers, and so on and so forth. When it comes to data, they’re just reporting levels one and two, if we’re taking the Kirkpatrick model, and they’re used to not really being asked for much more. They might be interested in reporting more or looking at data more deeply, but it’s a case of ‘if you have time in between projects’, and of course that never happens. It’s on the ‘would like to have someday in the distant future’ list in many cases.
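The Kirkpatrick model she mentions can be sketched as a simple mapping from level to the kind of data it needs. The example metric names below are hypothetical illustrations, not from the conversation; the point is that teams collecting only the usual data can only ever report levels one and two:

```python
# The four Kirkpatrick levels, each paired with hypothetical example metrics.
KIRKPATRICK_LEVELS = {
    1: ("Reaction", ["course satisfaction score", "completion rate"]),
    2: ("Learning", ["pre/post assessment delta", "quiz pass rate"]),
    3: ("Behaviour", ["manager-observed skill use", "CRM usage frequency"]),
    4: ("Results", ["sales numbers", "staff turnover", "compliance incidents"]),
}

def reportable_levels(collected_metrics):
    """Which Kirkpatrick levels can actually be reported,
    given the set of metrics an L&D team collects?"""
    return sorted(
        level
        for level, (_name, metrics) in KIRKPATRICK_LEVELS.items()
        if any(m in collected_metrics for m in metrics)
    )

# A team that only collects the usual data can only report levels 1 and 2:
usual = {"completion rate", "quiz pass rate"}
print(reportable_levels(usual))  # → [1, 2]
```

Levels three and four are where the deeper data work discussed above comes in, because behaviour and business results live in other departments’ systems.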
Brendan Cox 40:20
I think it’s going to be interesting. It’s a bit like the Wild West at the moment. But once everyone starts to find their footing with everything, I think they’re going to say: actually, we should take this seriously. And like you say, with the people that are banging the drum already, there’s going to be more and more proof of how valuable it is. And then, bit by bit. It’s going to be interesting.
Anja Hartleb-Parson 40:47
I’m not sure if this is true, but just thinking about it logically, the larger organisations might be leading the charge there. And that’s probably purely a function of how much they spend on training and development. If you have a multinational with 50,000 people, you’re spending millions and millions of dollars every year on training. So that’s definitely something.
But then again, there are certain companies that have accepted that that’s just what they spend, so I don’t want to say that’s necessarily true across the board. With smaller organisations, you would think that having to be scrappier would turn the focus more on return on investment. And I want to say, return on investment sounds so cold and calculating, but it’s not just saying, okay, let’s look at what we spent and what we brought in, and then take the difference. Return on investment is more than that; it’s a richer concept.
It’s not that easy to measure when you’re looking at soft skills, for instance; you do end up making some estimates about what the actual dollar impact is. So it’s not a perfect measure; like measuring happiness for people, it’s a bit tricky. But then there are things like staff turnover. It’s not going to be 100% down to training, but if you’ve put the effort into improving the soft skills of the management and recording their performance, turnover and retention are definitely things to look at. I think engagement is pretty well established as a measure, and we know that engagement impacts productivity and the employee experience. So even looking at that from a return on investment perspective makes sense to me.
You can look at something like sick days: just the number of days people are not at work but you’re still paying them. It’s a reasonable assumption that when you have a lot of people taking a lot of sick time, it’s not that they’re all sick; maybe they’re burnt out, or not really enjoying what they’re doing. What things are already being measured in other departments, and how can we reasonably tie training to that? So the exchange of data is really important, so that your L&D department isn’t starting from scratch. Sometimes that’s really part of the problem: you exist in a vacuum when it comes to data, and your marketing and sales area, or your human resources folks, are not really sharing data. That’s not a good place to be.
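The proxy-metric idea, tying training to something another department already measures, could be sketched like this. The monthly figures and the rollout month are entirely hypothetical, and, as she cautions, a shift in such a metric is suggestive rather than proof:

```python
from statistics import mean

def proxy_shift(metric_by_month, rollout_month):
    """Average monthly value of a proxy metric before vs after a
    training rollout. A drop is suggestive, not proof: training is
    never the only variable behind a metric like this.
    """
    before = [v for month, v in metric_by_month if month < rollout_month]
    after = [v for month, v in metric_by_month if month >= rollout_month]
    # A positive result means the metric dropped after the rollout.
    return mean(before) - mean(after)

# Hypothetical average sick days per employee, by month;
# say soft-skills training for managers rolled out in month 7.
sick_days = [(1, 1.9), (2, 2.1), (3, 2.0), (4, 2.2), (5, 2.1), (6, 2.0),
             (7, 1.8), (8, 1.6), (9, 1.5), (10, 1.4), (11, 1.5), (12, 1.4)]

print(f"average monthly drop after rollout: {proxy_shift(sick_days, 7):.2f}")
```

This only works if HR is actually sharing the data, which is exactly the vacuum problem described above.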
Brendan Cox 45:12
I suppose the bigger a company gets, the more internal communication becomes a problem. What’s common knowledge to one person isn’t to someone else.
Anja Hartleb-Parson 45:23
So as an undergrad, I studied philosophy and psychology. And then in graduate school, I went on to focus on political theory, political philosophy and biopolitics. And biopolitics is a hybrid field of philosophy, psychology, anthropology, sociology, neuroscience and biology.
It’s literally everything thrown together to look at human behaviour and how it has evolved over time. So that really shaped the way I started to think about why humans do the things they do, or at least how to analyse those things. Out of that come things like behavioural economics, for instance, and it’s just absolutely fascinating to me. As we look at those things over time, we are also seeing that human nature is quite enduring, in that it doesn’t really change much; our circumstances change, right?
We have, for instance, more access to technology, but the principles of what drives human nature are pretty fixed and not very malleable. I’m not a behaviourist in the B.F. Skinner sense, because I think that’s a little myopic from an analysis perspective. But I do think this behavioural piece is really important: understanding where it’s coming from and what is driving it. Otherwise, you’re just trying to effect changes in people that are next to impossible. If you want to get political, we know that engineering human behaviour leads to very disastrous results.
That’s how I got into the organisational behaviour piece. And from a data perspective, behaviour is definitely something you can measure more easily, but you do need to understand why you are measuring it and what it actually tells you in the end. In other words, your interpretation is just a map, and the limits of your interpretation are just as important as the approach you’re taking to measurement.
Brendan Cox 48:51
Some people like to look up at the sky and imagine the universe, and some people look up at the sky, get freaked out that maybe aliens are going to come and get them, and put their heads back down. There’s that thing with humans: we get stuck in the moment, we don’t step back from stuff. And I suppose the data side of it is stepping back and seeing the bigger picture, not reacting to things but actually observing them properly. Like a scientific approach.
Anja Hartleb-Parson 49:26
It is that, but it’s also hard, and I don’t like doing hard things. How many people make new year’s resolutions to actually start exercising and eating right? Even when we know we need to do hard things, that doesn’t mean we’re motivated enough to do them. That’s a big piece of the puzzle.
Human motivation in general. The ancient Greeks already wondered about that, and I’m not sure how far they got. It’s very complex; we’re dealing with human beings. And granted, training and development has recognised that learning is more than just pumping knowledge into people. That’s why it’s a complex thing to measure. But that’s why I was saying earlier: don’t let the perfect be the enemy of the good. You can do some things, and you should do some things, in terms of measurement, as long as you have the caveat of: well, we do realise there are other variables, and we can’t measure every possible variable to ascertain the impact of our training programmes.
But we’re doing something, and we’re getting some idea of whether it’s effective or not, or at least where we need to make improvements. It’s about how you set the goals, right? They should be ambitious, of course, but they shouldn’t be unreachable.
Brendan Cox 51:23
It’s that SMART goals thing. It needs to have a logic to it, not just ‘oh, we’re going to do it, it’ll be great’, when there’s no way we could do it, or no way we can measure it, or no way it actually helps. It’s funny how enthusiastic we get about the ideas of things, and how we forget to actually make sure they work.
Anja Hartleb-Parson 51:50
I think it’s good to have big ideas. There’s no doubt about that. But I do think sometimes we get stuck in ‘if we can’t do that, then we won’t do anything’.
I’m not sure that’s always the best way to look at it either. So behaviour tells you some things; it doesn’t tell you the whole story, of course. And that goes back to the point I was making about quantitative versus qualitative: we need both, and we should value both types of analysis. Anyway, it’s all very fascinating to me at the end of the day, and I get very nerded out about it. But I also get very frustrated about it, because I’m passionate about it.
Brendan Cox 53:01
I think knowledge is also a burden, because the more you analyse stuff, the more holes you see in things. The further back you sit from things, the more errors you can see. It is frustrating, because you want to be able to fix everything, but you can’t. The general idea is that as long as you’re going in the right direction, you can be happy with what you do. Being too self-aware is almost a burden.
Anja Hartleb-Parson 53:31
It’s the curse of knowledge.
Brendan Cox 53:37
I always paraphrase it completely wonkily. But the gist of it is: knowledge is a burden.
Anja Hartleb-Parson 53:43
It makes sense. That’s why we should be looking at people who don’t necessarily have L&D experience in particular. And I know that’s not how the job market works. But when you have people coming from different areas and very different perspectives look at what you’re doing, they might be able to tell you this doesn’t make as much sense as you think it does. Of course, they would say it way more diplomatically than that.
Brendan Cox 54:27
Totally. It’s like Silicon Valley, when they’ve got the gurus come in to help the CEOs from a completely different perspective, or they get the CEOs to go off on a retreat where they do ayahuasca and have a trip that then balances out the business side of them. But that’s what I find really interesting about what you’re doing, because you’re basically driving to understand people, but you’re coming at it from a behavioural side and also from the data side as well. It balances out the lenses that you’re looking at it from.
Anja Hartleb-Parson 55:11
Another way of putting it would be: when we’re looking at the behavioural piece, we also know what it’s not telling us, or at least we should know, and be aware, and have some humility about what it’s not telling us. The same goes for data.
We look at data and it’s telling us something, but it’s also not telling us something, and we need to be very aware of that. The opposite problem, people being too data-driven, is also a problem. And I haven’t seen that problem in L&D yet; maybe it’ll be a mark of progress if we ever get there.
Brendan Cox 56:00
Humans have a tendency to swing like a pendulum. So first we’re not going to do anything with data, then suddenly we’re going to analyse everything. And then no, that was too much. And eventually, as we slowly swing back and forth, it lands in the middle. But if we end up with too much data in L&D, then at least we’re going in the right direction.
Anja Hartleb-Parson 56:25
That’s true. I think that’s the part you need to be very aware of when you’re hiring people in data analysis. I think we tried to talk about this earlier, but then I got off on a tangent, which is the communication piece that you mentioned.
Data analysts can be extremely skilled and highly technical. But that’s not going to help you very much if they can’t communicate to you what the bottom line is. In other words, what interpretation we should take from the data and what we shouldn’t.
So I think that’s a soft skill data analysts definitely need to develop if they haven’t already, or one that needs to be taught in data analyst programmes. I luckily developed that skill on my own. But I will say that in the data science programme I’m just finishing up, the focus hasn’t been so much on communication.
So I worry sometimes, when you’re hiring people who are very technically talented and skilled, about how that’s going to affect what you can take away from the data, and particularly what you can communicate to your stakeholders, who often don’t have anywhere near that technical knowledge, and will often also lack more basic data analysis or data literacy skills, unfortunately.
Brendan Cox 58:39
I think that’s the thing: the more technical the role gets, the more jargon there is. The more analytical a person is, the more they’re analytical for themselves. They’re not talking about it in the way a regular person would; they’re used to discussing it at an expert level. And like you say, that is definitely a really important skill, to basically work as a translator.
Because that’s one of the lovely things about really good scientists, like Bill Nye the Science Guy: his art is being able to explain something really, really well, so you don’t feel like you’re stupid. You can actually learn it from him; he can translate into layman’s terms so you learn without being talked down to or being ‘jargoned’.
Anja Hartleb-Parson 59:51
Exactly. I think L&D people have an advantage because they teach. Hopefully, they have acquired the skill to teach well over time, and to break complex things down into bite-sized, digestible pieces. So when I look at job descriptions for data analysts or data scientists and they say you must be able to write x, y and z algorithms, I think about it from the perspective of: well, that you can learn fairly quickly.
By comparison, the soft skill of effective communication takes a lot longer. It also requires a certain attitude, and I think attitude can’t be taught. So when you’re hiring people, you want to look at getting the right balance, and it might not be the best idea to hire the most technically skilled person. Instead, you get someone who is very teachable, who has the ability to gain knowledge quickly and is comfortable with learning, but who also has a good repertoire of the necessary soft skills.
Brendan Cox 1:01:36
It’s that thing we’ve said a few times before: people get promoted to the level of their incompetence, the Peter Principle. Everyone’s really technical. And then because they’re technical, you make them the manager of the technical department, but you don’t train them in how to manage, which is a whole different skill set.
Anja Hartleb-Parson 1:02:02
Or you do train them, or you offer training, and it just doesn’t work out very well, for various reasons, because this person just doesn’t have the right attitude. They’d much rather still be the coder, but somehow leadership has convinced them that if they want to make more money, then they really should acquiesce to the demand of becoming a manager. So there are certainly these types of internal pushes that happen.
Brendan Cox 1:02:45
There’s this assumption that the direction is always up, and that it naturally just keeps going up; that’s how you get more successful. Especially for creatives and technical roles. As designers, it’s in the back of our heads; we’ve all been conditioned to think that, okay, we’ve been a designer, now we’re head designer, and at a certain point we become successful enough that we should open a studio and be in charge of a design studio. And then that’s the pinnacle of our thing, and then we win loads of awards.
But running a design studio is not even slightly the same as actually being a designer. In fact, of the people I’ve spoken to who run a studio that’s not just a two-person group, none of them get to do any of the day-to-day design anymore. So it’s a weird mentality we’ve all got in us: we all think that you just keep going, and then you become a manager. And actually, it’s a whole different set of skills.
Anja Hartleb-Parson 1:03:51
It’s this skill set that’s really necessary to keep your team of employees happy and producing well, because crap managers have a really negative effect on the bottom line when it comes to the employee experience and subsequent productivity and retention.
Brendan Cox 1:04:28
Anja Hartleb-Parson 1:04:34
This is also just one of those things about people development that always baffles me a little bit. Because I always think certain things are common sense, and it turns out they’re actually not.
Brendan Cox 1:04:59
There is no such thing as common sense. That would be a good name for the podcast. There’s no such thing as common sense with Brendan Cox.
Anja Hartleb-Parson 1:05:13
We shouldn’t assume it. Organisational Behaviour is such a fascinating thing to me. The impacts are so vast, and sometimes the things we need to do at least appear on the surface to be so simple, but we just don’t do them.
Brendan Cox 1:05:46
Seeing the bigger picture has always been a problem with everything from the environment to society to general personal wellbeing.
Anja Hartleb-Parson 1:06:01
So the biopolitics view there is that we often need things to touch us personally, to affect us personally, before we deem them important enough to act on, like COVID. That happens a lot. And I think in-group/out-group thinking, this idea of tribal mindsets, as much as we like to consider ourselves advanced, is still very much part of our human experience.
I don’t look at it from the perspective that we need to stamp it out. But I think that what you were saying about self-awareness is just the critical piece that needs to happen. And that’s very hard. Being objective with yourself is extremely difficult, because we don’t like to admit our own failures. We don’t like to think about having done something wrong, having made mistakes, or having believed in something that’s not correct, or that’s not right, because we invest energy in that. And that goes back to your point about conserving energy.
So the sunk cost fallacy is very real in human beings: the idea that we keep doing something, or we keep a piece of software around that’s not working properly, because we think, well, we invested all this money in it, or we invested all this time in it. It’s like the person who goes to the movies, and 20 minutes in they realise this is a crap movie. I guess I’m gonna have to sit here, because I already paid for the ticket.
Brendan Cox 1:08:09
I did that with Catwoman.
Anja Hartleb-Parson 1:08:14
I did not see that movie, but I don’t think a lot of people did. Anyway, that’s the thing we need to combat on an almost daily basis.
Brendan Cox 1:08:31
So start looking at the data and don’t take the easy way out.
Anja Hartleb-Parson 1:08:38
That’s also the thing that affects data, though, right, which is where I go back to this point about interpretation. Data itself is neutral, but what we collect in terms of data, how we interpret it, and what we decide to focus on, that’s not neutral at all. It’s not value-neutral at all; it’s very much influenced by our preferences and values and biases. So don’t focus so much on getting the most technical person; focus on a person who has the ability to approach and interpret data in a meaningful way.
Brendan Cox 1:09:34
Great. So where can someone find you online, if they want to chat with you and find out more about what you do?
Anja Hartleb-Parson 1:09:45
I am on LinkedIn. And that’s pretty much the only place you can find me right now because I try to really curtail my social media to be as productive as I can with it. But I do love LinkedIn. And definitely reach out to me there, send me a message, I am always open to connecting with new people.
Brendan Cox 1:10:17
Cool. Well, it’s been really nice chatting with you, and really interesting to get your insight on it all, and on the data side of things. I’ve always been obsessed with process improvement, but I’ve never really done it for other people’s businesses. I’ve always done it for my own freelancing, building templates and keeping records of everything on the side to work out what was quicker and so on. So chatting to someone who really knows what they’re talking about on a grander scale, and from the business side of things, is really interesting.
Anja Hartleb-Parson 1:10:50
I think it’s really important for your career development, too, because you want to be able to report some numbers-based achievements when you’re working with clients or potential clients, or when you’re trying to find a new job. I think it’s always really valuable to show how you’ve actually moved the needle on things.
Brendan Cox 1:11:21
Totally. I think that’s a really good point. Great, well, lovely chatting, and thanks very much.
Anja Hartleb-Parson 1:11:28
Thank you for having me.
Thanks for listening to the Blend podcast. It’s available on Spotify, Google and Apple. You can find Blend interactive content on LinkedIn, or at www.blend.training. Don’t forget to like and subscribe. See you next time.