Introduction
Bradley Anderson is currently a Sr. Principal Software Developer, Data Scientist, and Strategy Consultant. His role encompasses developing and integrating software to solve logistical and operational challenges for the Federal Government. While his work is classified, he was able to disclose that his current area of focus is managing logistical challenges related to the return-to-office mandate for Federal Agencies. Thousands of employees returning to their offices present operational challenges, one of which is ensuring there are enough seats available for the employees who return. The intent of this interview was to understand Bradley’s perspective on the role data plays in large enterprises and how he’s seen organizations successfully implement data transformations.
Bradley is currently a software engineer and Data Scientist, but his career actually started in international policy. He majored in International Relations at BYU. Shortly afterwards, he built a resume of distinguished positions in international economic policy, including internships at various international NGOs and developing economic policy for the European Union. After his internship with the EU, he continued contributing to international economic policy at the World Trade Center, specifically analyzing the Trans-Pacific Partnership. Prior to his departure from international economic policy, Bradley contributed to the Center for Strategic and International Studies (CSIS) in Washington, DC, with a focus on China and Japan relations.
During his work with CSIS, he saw that it was statistics and econometrics that had the greatest influence on decision-making. He saw that it was the individuals who used data to tell stories who were taken most seriously and most likely to have their recommendations implemented. It was at this point that he discovered the “quantitative argument” and began harnessing the power of data to explain the policy narratives he was analyzing. He took the time to learn R, Python, and Tableau, and began implementing these technologies to augment the policy work he was involved with.
His work at this point had become more technical, but he continued to blend his policy expertise with software engineering as a government consultant. He made his entrance into software engineering as a Cost Analyst with the Department of Defense. He then held two sequential consulting positions, at Deloitte Consulting and Booz Allen Hamilton, where he continued to expand his software engineering skill set. In these roles, he analyzed processes for the Air Force and worked alongside senior leadership on strategic roadmap development and consulting. Working closely with leadership, Bradley learned that data is the language of decision-making and that “data is the new bacon”.
Interview
What is your current role? And what projects are you currently working on?
My current title is Senior Principal Software Developer / Data Scientist / Strategy Consultant. Essentially, that means I field requirements from the client and translate them into technical solutions. Projects range pretty widely, from solving physical and digital logistics questions to virtually any other challenge that can be solved with analytics and software.
For instance, one of my clients is facing a classic logistical challenge; that is, getting butts in seats. It’s a classic problem of having too many butts and not enough seats for them to sit in. I also work with my clients on strategic technology integrations. There are many Excel Ninjas who are able to input data fairly quickly into spreadsheets. However, there are limitations with this.
What I do is help integrate the latest and greatest critical and emerging technologies to help solve problems like that. Don’t get me wrong, I like Excel; it makes the world go round and is a valuable platform. But being stuck on it sometimes causes my clients to miss out on opportunities that could greatly improve their operations. This includes basic opportunities, like simple automation pipelines, but it also includes transformational opportunities, such as AI technology.
In a nutshell, I leverage various technologies in order to redirect humans from squinting at spreadsheets towards being able to tell a story and, in my case, help them identify the right policy and do better data collection for better insights.
How do you think about the use cases and opportunities of data?
What makes data valuable are its attributes and the functions and roles of the individuals who manage it. The roles that I see are Data Scientists, Data Engineers, and Data Analysts. And frankly, I don’t think the industry knows how to discern those roles very well, even still.
Roles
I hear people talk about Data Scientists as though they’re Economists or Statisticians. But I also hear the industry using the same terms for Data Engineers when they discuss the need for a data architecture. Then you’ve got Data Analysts. Analysts craft stories from data, and maybe don’t do as much back-end data manipulation. I tend to find that the intent of senior leadership is often data collection and data manipulation in order to build confidence in a strategic narrative. This is important, because they need high degrees of confidence when constructing high-impact decisions. Leaders need to make choices and now they’re in a world where they can leverage a lot more tools to speak with significantly greater confidence than they had before.
In a way, leaders can de-risk their decision-making process by using data. This is because with data-driven decision-making, it is the data and methodology that drive the narrative, rather than the opinion or intuition of the highest paid person in the room.
“Leaders need to make choices and now they’re in a world where they can leverage a lot more tools to speak with significantly greater confidence than they had before.”
Attributes of Data
The use cases of data are a massive topic, but analyzing the attributes of data can help leaders assess the value of individual data initiatives. The ultimate goal of the data initiatives I oversee is to increase leadership’s confidence in the data we use to drive our strategic decisions. We improve confidence in the data we produce by improving its attributes. These attributes include accuracy, completeness, consistency, timeliness, validity, reliability, uniqueness, relevance, and accessibility. You can bucket data initiatives in terms of data attributes, either individually or in groups. There are probably other attributes and the list is constantly growing, but these attributes are foundational, and each contributes to the overarching goal of increasing leadership’s confidence in our data and reporting.
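These attributes can also be measured programmatically. As a minimal sketch, assuming a simple list-of-dicts dataset (the field names and records below are invented for illustration, not drawn from any real system), completeness and uniqueness might be scored like this:

```python
# Hypothetical sketch: scoring a dataset against two of the data
# attributes mentioned above (completeness and uniqueness).
# The employee records and fields here are purely illustrative.

def completeness(records, field):
    """Share of records where the field is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def uniqueness(records, field):
    """Share of values that are distinct (1.0 means no duplicates)."""
    values = [r.get(field) for r in records]
    return len(set(values)) / len(values)

employees = [
    {"id": "E1", "name": "Ana"},
    {"id": "E2", "name": ""},     # incomplete: missing name
    {"id": "E2", "name": "Raj"},  # duplicate employee ID
]

print(completeness(employees, "name"))  # 2 of 3 names filled
print(uniqueness(employees, "id"))      # 2 distinct IDs of 3
```

Scores like these could feed a simple data-quality dashboard, turning the abstract attribute list into something leadership can track over time.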
Achieving these basic attributes of data is essential before there can be any real data storytelling or analysis. The reality is that integrating technology is often the most valuable activity, and simply optimizing day-to-day data aggregation and reporting is the stuff that analysts and decision makers depend on. There are attractive and unattractive aspects of data, and arguably, the most valuable use cases fall within the unattractive aspects. The data game is very interesting, and you get to rub shoulders with the highest of the highs, but also the lowest of the lows.
Just to name a few attributes I see and how they relate to common use cases: we want accurate information so we know that the right systems are patched. We want consistent data when monitoring social media posts; this might mean being able to see period-over-period results, and whether or not someone posted about Taylor Swift. We want completeness when matching names to employee IDs for sending out paychecks.
When I think of common data use cases, I tend to first think of the automation piece and real-time reporting. Automation requires timeliness. For instance, if you want to know whether or not revenue is increasing, then the data should be produced in real time, not once a year. Additionally, this use case speaks to the completeness of data: revenue transactions shouldn’t be missing records or contain inaccurate fields. Specifically in my organization, this might mean identifying cyber security vulnerabilities in real time or monitoring new-hire onboarding on a regular basis.
Additionally, automation is especially important when we have systems that support the continuous flow of data. In these systems, there are orchestrators and automation is required to ensure the completeness of data. In other words, how do you take incomplete data and then turn that into something valuable?
“How do you take incomplete data and then turn that into something valuable?”
What use cases are you seeing, and what problems are you trying to solve?
Every organization has different roles that will have different use cases; this includes analysts, accountants, consultants, et cetera. And while all of these roles have different use cases, they all share a common problem: a lack of automation.
When anyone in one of these roles needs to pull data from somewhere, they’ll often use Excel to pull the data together. They might spend an hour each week pulling together new or even recurring reports. Maybe they’re familiar with standardizing the report, or maybe they’re not and they need to figure it out. Pulling a report together could take up to three hours. And if the report needs to be created once a week, then that analyst could be spending 144 hours a year on something that could genuinely be automated. That’s only one person doing one task. If two people were required, then it would be 288 hours a year, almost six work weeks.
The alternative is simply automating this report by making it into a dashboard. This dashboard would be a URL that you’d click during a meeting, and the data would be orchestrated automatically without anyone’s effort. In this case, those two analysts could redirect 288 hours from the process side of data toward higher-value strategy work.
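The arithmetic behind those figures is simple enough to make explicit. Assuming roughly 48 working weeks per year, which is what the interview’s numbers imply, the estimate works out like this:

```python
# Back-of-the-envelope estimate of hours reclaimed by automating a
# recurring report, matching the figures in the interview:
# 3 hours per report, one report per week, ~48 working weeks a year.

HOURS_PER_REPORT = 3
WEEKS_PER_YEAR = 48  # assumption: working weeks, not calendar weeks

def annual_hours(analysts):
    """Total analyst-hours spent per year on the manual report."""
    return HOURS_PER_REPORT * WEEKS_PER_YEAR * analysts

print(annual_hours(1))  # 144
print(annual_hours(2))  # 288, almost six 40-hour work weeks
```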
Another common use case that comes up is the simple problem of coordinating data. This would include someone who shares a spreadsheet around the office in order to collect inputs. There might be three people involved in creating the report, all trying to make their inputs. This manual process includes fat-fingering, manual mistakes, and reconciling the inputs. Ultimately, we’re looking at about 960 hours a year across three people, and you may not even have full confidence in the data.
Alternatively, you could just send out a webform with validated fields. People would directly input data. Super quick, and super easy. Heck, you can even say, here’s an Excel doc that someone gave to me and they wanted me to do something with it. Instead of having to share that around and coordinate, you just upload it. Boom, the inputs go into Oracle SQL, or whatever your backend is, which applies the logic and pushes it to a report, maybe Tableau or Power BI. Boom, you now saved 960 hours a year so that you could do more analysis and less process-oriented work.
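A minimal sketch of that validated-inputs-to-SQL idea, using SQLite as a stand-in for Oracle or whatever backend an organization runs (the field names and validation rules here are hypothetical):

```python
# Sketch of the "webform with validated fields" pattern: each input
# is validated once, then written straight to a SQL backend that a
# reporting tool (Tableau, Power BI) can read from. SQLite stands in
# for Oracle here; the fields and rules are invented for illustration.

import sqlite3

def validate(row):
    """Reject malformed inputs before they ever reach the backend."""
    if not row.get("employee_id", "").startswith("E"):
        raise ValueError("employee_id must look like 'E123'")
    if row.get("hours") is None or row["hours"] < 0:
        raise ValueError("hours must be a non-negative number")
    return row

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inputs (employee_id TEXT, hours REAL)")

for raw in [{"employee_id": "E101", "hours": 6.5},
            {"employee_id": "E102", "hours": 8.0}]:
    row = validate(raw)
    conn.execute("INSERT INTO inputs VALUES (?, ?)",
                 (row["employee_id"], row["hours"]))

# The dashboard layer would query aggregates like this one.
total = conn.execute("SELECT SUM(hours) FROM inputs").fetchone()[0]
print(total)  # 14.5
```

The key design choice is that validation happens at the point of entry, so reconciliation work downstream, the 960 hours in the example above, largely disappears.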
That’s cool and all, but what is the ultimate goal? Well, this is what is critical and emerging. Technology kind of reflects the data game. There are so many aspects of data that we could talk about, but one of the key aspects is the integration of technologies. I’ll tap a little bit into the strategy consulting side of things. Strategy consulting, among the different titles, is what competition is all about. You can look at the corporate environment, at titan-class companies, or at whoever it is that integrates the latest and greatest technology. The integration of technologies is not just cool; these integrations genuinely do yield faster results.
When you integrate better technologies, you can comprehend, assess, and develop strategy faster than other companies. From a game theory perspective, he who has the money makes the rules in this game. To put it another way, he who has the data makes the rules.
“He who has the data makes the rules.”
With better data integration comes the first-mover advantage: you’re able to derive insights faster, which speaks to the timeliness attribute of data. If you have systems that can move fast but can also handle massive volumes of data, and if you’ve got lots of different technologies that can collect data and automatically validate its attributes, then your company has a competitive advantage. Mastering the mundane aspects of data, namely the processes and attributes, doesn’t sound all that great, but it actually aggregates up at the corporate level. And the corporate level aggregates at the nation level. In other words, he who has optimized data makes the rules, and you get the first-mover advantage. It’s all about time to insight.
What problems are organizations trying to address and what types of data use cases are you seeing?
This is where I start talking about the more attractive use cases of data, like machine learning and artificial intelligence. Those are buzzwords, but when you look at it from the perspective of middle management and tactical level employees, the difficulties they have are in finding valid use cases.
I think largely the problem is that people are looking for the really provocative stuff. They hear all the buzzwords, they see all the conferences, they’re so excited, and they want to change everything. Frankly, ML and generative AI are phenomenal, but the vast majority of analysts out there don’t need that. It’s kind of like finding the right tool for the right job. You don’t bring a hammer to wipe off a mirror. We haven’t yet discovered or overtly articulated a whole lot of use cases with clear value for applying tools like GenAI or ChatGPT. These technologies are solving big problems; however, those problems and use cases are pretty narrow.
“You don’t bring a hammer to wipe off a mirror.”
So what does identifying machine learning and artificial intelligence projects really look like in a day-to-day world? It’s adding a column to a dashboard that might show the probability someone will sit in their assigned seat. When employees onboard, their assigned projects, where they’ll be working, and who they’ll be working with are analyzed by HR, or some case manager. And HR uses this information to determine where that employee will be assigned to sit. So this specific use case relates back to the original butts and seats problem: do people actually sit where they say they’re going to sit? Is that a machine learning problem? Absolutely. And it’s a classification problem, versus a regression problem.
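To make the "probability column" concrete, here is a toy version of that classification output. A simple historical-frequency estimate stands in for a trained classifier, and the login records are invented; a real system would draw on the login-versus-seat-assignment data described below.

```python
# Hypothetical sketch of the seat-compliance use case: treat "sat in
# assigned seat" as a binary classification target and estimate, per
# employee, the probability they actually sit where assigned. A
# frequency count over historical logins stands in for a real model.

from collections import defaultdict

# (employee_id, sat_in_assigned_seat) observations, invented for
# illustration; in practice these would come from login records.
logins = [
    ("E101", True), ("E101", True), ("E101", False), ("E101", True),
    ("E102", False), ("E102", False), ("E102", True), ("E102", False),
]

def seat_probabilities(records):
    """Map each employee to their historical seat-compliance rate."""
    counts = defaultdict(lambda: [0, 0])  # employee -> [compliant, total]
    for emp, compliant in records:
        counts[emp][0] += int(compliant)
        counts[emp][1] += 1
    return {emp: hit / total for emp, (hit, total) in counts.items()}

probs = seat_probabilities(logins)
print(probs["E101"])  # 0.75
print(probs["E102"])  # 0.25
```

The dictionary this produces is exactly the extra dashboard column described here: one probability per employee that planners can sort and act on.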
On the surface, the result of solving this problem is mundane, but it’s actually a big deal. The result tends to be a new column that might indicate the probability that employees will actually sit where they’re assigned. Why should leadership actually invest in this mundane feature? Well, because that one column could determine millions of dollars worth of direct costs.
For an enterprise, if you have thousands of people sitting in places where they’re not assigned, especially if it’s not on company property, then multiple spots might be provisioned for one person. More importantly, if you have new hires, transfers, and partners who need spaces, then they can’t start the work they are contractually assigned to do because there just aren’t enough spots. That’s a problem. But you can optimize this coordination problem by tracking logins against seat assignment locations, and then make the easy choice of allocating seats based on probability. Amazon built their empire through probability optimization in their supply chain. Believe it or not, this actually can save an organization millions of dollars or more.
Finding the Right Kind of Tool for the Right Kind of Problem
Tactical and middle management tend to have problems with identifying the right kind of tool for the right kind of problem. For instance, when do you use SQL versus a NoSQL tool? Well, NoSQL is able to handle the 1% edge case where you need to monitor huge volumes of continuous data, such as Facebook feeds. However, these use cases are rare; for the most part, 99% of problems can usually be solved by standard tools such as Oracle SQL, or MongoDB.
Articulating the Higher Level Impact
Identifying use cases for emerging technologies and AI can be challenging because leadership hasn’t adequately articulated the higher-level impact. Being able to see that higher-level impact is the most exciting aspect of implementing emerging technologies and AI. With a clearly articulated outcome or vision from leadership, the middle and tactical levels can better identify appropriate use cases. Middle management and senior leadership hear the buzzwords, but they don’t see what the outcome really is. I think a lot of them just can’t see it because it isn’t articulated well to them, and so it isn’t well articulated to analysts and employees either.
What obstacles are leaders facing and what are they doing to overcome these obstacles?
There are a couple of high-level buckets of obstacles that leaders are experiencing. The first bucket involves defining vision; we’ll call this bucket identifying use cases. The next bucket is articulating the use cases: knowing how to find them, recognizing pain points, recognizing that a pain point can actually be solved, and then challenging your assumptions. Then, hand in hand with the use cases, is identifying all the leaders and stakeholders who need to be persuaded.
Persuading Leadership
You know, it takes more than just convincing the Division Chief, the Department Assistant, the Colonel, the Director of your Marketing Department, et cetera. In any organization, you’ll have many leaders who will love the idea of LLMs and GenAI. These leaders love the idea of automating repetitive tasks, and they love the idea of optimizing attributes of data, like timeliness and reliability. But they need to be able to justify these initiatives to leadership at each and every level. Each layer of leadership needs to be considered because each layer has a different strategic objective.
At a tactical level, leadership might be resistant to change because of the risk of changing technologies. When shifting over to a new technology, there are many dependencies to consider. For instance, how do you shift over to a NoSQL environment while minimizing or mitigating any unintended consequences? By not cutting over all at once, we can shift segments of the organizational capability to the new technology in phases. Those are the types of considerations that you’re going to have at the middle management level, and then at the senior leadership level as well.
At a strategic level, leaders might be presented with an opportunity to invest in a $5 or $10 million plus data warehouse. Given the risks of that investment and risk of cut-over, they’ll ask if the organization is really at a point where they need that. Are they actually at an existential tipping point? They’ll ask if there is a way to do a segmented shift. Those are just some of the things that need to be considered and when driving a technical and cultural shift you need to be able to convince individuals at every leadership tier.
“Are they actually at an existential tipping point?”
Resistance to Change
Now, let’s talk about the other category that I was thinking of, which is resistance to change. This part is heavily focused on culture. I’d say the chief obstacle in my mind isn’t just identifying who all the stakeholders are, like in Persuading Leadership. You can convince a senior leader and get them all on board. The real obstacle, and let me emphasize this point, is the individuals at the most tactical level; that’s the part I find hardest.
So let me reframe that: the chief obstacle is really convincing individuals to buy in, shifting their attitudes, and shifting them to a new technology. For instance, an Analyst might say, ‘hey, I’m an Excel Ninja and I don’t need to use Python. I’ll never touch that stuff. If I’ve been able to do my job well before, then why would I change now?’ The issue is, how do you convince this individual that change is necessary?
This actually makes me think of a book from the nineties on organizational behavior, written by Spencer Johnson, called ‘Who Moved My Cheese?’. This book illustrates one of the main challenges faced by organizations; which is change in operations and culture. Operations being that technical part. Everyone’s industry is constantly influenced by new methods and new technologies, so the inner workings and people have to adjust. Johnson’s parable uses mice in a maze representing different reactions that archetypal characters have when new incentives are introduced. And in the book’s case, this incentive was cheese. Incentives are meant to motivate the mice. He talked about requiring the mice to relocate to another part of the maze so that they could do their jobs more efficiently. And he discusses the many categories of mice, or people. The last two mice to relocate to the new part of the maze were those who dragged their feet in response to needed change, and then those who flat out resisted it. These were the mice who simply said, ‘I won’t’. The book goes on to talk about the main cultural aspects and implications of this scenario.
This parable stood out to me and related to what I’ve seen when implementing new technology, including my experiences at the Pentagon, at the State Department, and beyond. What do you do and what kind of incentives do you give when you need to implement change? Unlike the private sector, the government is a bit more limited in the assortment of cheeses it can use to motivate workers to adopt technology. So I’d say that’s the chief problem there. Now, what’s going to be the solution to that? When we’re talking about the solution to resistance to change, the book emphasized that it’s a lot easier in the private sector. In the private sector, you can let somebody go. You dangle the cheese above the mouse, or employment above the person, and say, ‘look, we’re changing’. That’s kind of the bottom line. Well, in government, you don’t get to do that. Maybe you can quarantine people, you can move them. But none of this really helps.
So, how does all of this tie into data? Well, the big change in technology these days is how to harness data more efficiently and quickly, and how to capture all those other attributes that we talked about. The people who resist those technologies have, in aggregate, a net negative effect on the organization. Let me tie that into the nation-to-nation or company-to-company strategic level. It is at that most granular level of people integrating these technologies into the workflow that you get that aggregate advantage. If you’re able to get those insights faster, and you’re the one who knows how to act before the other guy, and it really does legitimately boil down to people using these technologies, then you’ll have an aggregate advantage. Now, obviously, that’s a pretty sophisticated problem.
“It is at that most granular level of people integrating these technologies into the workflow that you get that aggregate advantage.”
What are the characteristics of customers who have successfully transformed their cultures?
I’ve got some thoughts on that one, but let me pause to clarify something. I assume this is related to the role of leadership in identifying granular types of projects and use cases, as opposed to the role of the individual in identifying projects and use cases that aggregate into strategic advantages.
It is, as you know, going to be a back-and-forth where you have an initiative that comes from above, and you see how much people are actually responding to it. Collecting data, you see that there are major gaps in acceptance, and leadership reforms their approach. Do I need to change the process that I’m using in order to collect data? Do I need to change my messaging? What other incentives can I adjust in order to get people to adopt it? All parties work within the constraints of self-interest as well as material constraints: things like time, patience, and, I’ll even go as far as to say, talent, such as the ability to learn technologies.
Here’s an interesting use case to emphasize this point. It involves integrating new technology, specifically the problem of onboarding new employees. You have lots of data, you have repositories, and you have employees who are pulling data and then manually creating Excel reports. It’s not automated. It’s very manual, and maybe they’ll send it around to have people check it. Maybe there’s a standardized process, but maybe there isn’t. So where does the advantage come in for integrating technology and then getting people to use it? If people don’t have standard operating procedures (SOPs), then they may not realize that a task could be automated. Manually aggregating data and generating a report takes a lot of time, which basically means that employees are wasting time and energy.
One of the biggest opportunities that is treated as a weakness is looking at that and saying, ‘oh, our tools are not good enough.’ No, no, no. The tools are actually highly valuable. And this is where you get into that very important, significant insight that I think is a game changer. The game changer is in how you let the data tell the story. For instance, an analyst might complain that they’ve asked other teams to input data but the teams aren’t doing it. Leadership may say that they need to close the data input gap. In this case, the data itself is informing policy and cultural discussions.
“This game changer is in how you let the data tell the story.”
Leadership, instead of saying the data is not complete or there are mistakes in it, could say instead: you know what, I don’t care. I’m actually just going to make an application that shows the data as it’s being input. With enough improvements, that application is eventually able to tell a story on its own, in the form of a reporting dashboard. This might be a Tableau dashboard on a Tableau server.
By taking this approach, the data is informing SOPs. This is the kind of interplay and feedback loop required to build technical solutions that work for each unique organization. It’s incrementally taking shortcomings and transforming them into advantages. Individuals who are on board with this approach will actually have significant advantages. They’ll start to adjust, use the technologies, and showcase them. They’ll quickly start to share use cases, deliver value, and disseminate the value of this stuff. Senior leadership will see that and give them promotions, and they’ll ultimately benefit.
Senior leadership sees them doing that and acknowledges how much more effective these new technologies are at providing all those attributes of the data. It also equips them with the ability to tell the stories they need in order to convince people to fund their initiatives and organization, to identify gaps and build tools, and to hold their organization accountable and see who is following expectations. Yes, we want to save face and we want people to feel comfortable, but we also need enough discomfort that those who are resisting will follow. Maybe that sounds a little dictatorial, but it must be the objective of the individual to help the institution accomplish its goal.
Change is implemented use case by use case. And leadership might consider how much insight they need for each area, whether it be onboarding, payroll, or marketing. Since no one area should be neglected, it’s a significant opportunity for senior leadership, as well as tactical-level contributors and middle management, to use tools and technology to shine light on the cobwebs of problems [with data], then ultimately iron out those problems, develop good datasets, and achieve a high-fidelity storytelling capability.
Now, how does that change the game? Well, think about cyber security. Cyber security is a major priority. Everyone’s so excited about it. However, no one actually knows all the different pieces until they’ve dived in. Once they’ve started, they start seeing that there are significant problems with data, all the time, everywhere, across all the attributes of data. Well, that’s the kind of thing that you really want to know about, right? You apply that strategy, and it changes the game. As a result, someone who has high-fidelity data can speak with very high confidence about the vulnerabilities in the enterprise, and then leadership can prioritize initiatives to close those gaps.
How can leaders establish a vision that drives a data culture?
While at Booz Allen, I was a consulting Analyst, or a “Senior Whisperer”. In that role I worked with the United States Air Force. At the time, the Air Force Chief of Staff at the Pentagon was driving a heavy emphasis on building a data capability. The Chief of Staff initiated this transformation by first disseminating a vision. In similar situations, senior leaders usually don’t have a clear vision. They tend to follow the cues put out there by the more data-oriented leaders in their orbits. From what I’ve seen, these leaders will work with other people who are more data heavy and then disseminate that message.
While not directly involved in these discussions, my understanding was that the Chief of Staff was saying things like, “let’s be knowledgeable about what it is that we’re trying to define with data.” and “let’s make sure that we know what it is that we’re talking about first.” He established his vision and spoke in Air Force terms like, “if we want to be able to operate at speed of need, then we need to have speedy data”.
The Chief of Staff first established the goal, the vision. However, we also needed to have confidence in what it was that we were saying. So if we wanted to know that we had all our jets ready to go, then we needed to know how many jets we had and what their state of readiness was. We had to ask if a readiness metric was defined and whether we had the right data sources. At the time, I was privy to a couple of those types of conversations. There was nothing in the way of describing or explicitly targeting any of the attributes of data, but the directive to improve data availability, completeness, and reliability was implicit. And I knew that was the right way to go based off of my previous conversations.
Rarely are leaders able to know exactly what it is that they’re looking for. Especially if it’s a transformation into a space that they’re not used to. There is almost a phase where leaders are like, I know that it’s good because other organizations have succeeded with it. They might know that data is valuable and rely on it heavily. But are they necessarily curious about other technologies, and how different those technologies could enhance the quality of their data such that it actually increases their confidence and decision making? From what I’ve seen, it’s not that clear to most of the leaders I’ve interacted with. Maybe it’s a little bit more this way in the private sector. But if we’re talking about the public sector, then the thinking around data isn’t always so strategic.
Culture is very much a negotiation among many people. As human beings, we’re hierarchical creatures, and that doesn’t mean we just fall in line and embrace what’s being said. It’s very much a negotiation, and often takes the form of a tug of war. As a leader, you might offer a new framework to your people, and if they kick back, then you then ask, what can be done in order to get them to buy in? Is it coercion? As a leader, should I start pulling out the sticks? Or is it, rather, going to be through a method of enticement? Do I pull out the carrots? In other words, do I reward or penalize?
The people I've worked with have always been very sharp. If they're already predisposed to thinking in terms of data, then they're more likely to accept these new frameworks. But some people operate primarily with manual processes, like sending out spreadsheets and hand-jamming reports. In those cases, it would have been up to the individual to figure out how to achieve the change, how to implement it, and which technology to use.
If we're using government agencies as examples, no single data culture is a template that will fit every place. Consider nation building, for instance, and how much of a disaster it was to take democracy and ramrod it down the throats of Middle Eastern nations. The fundamental belief that every government, or every people, throughout the world would function better under a democratic system just does not play out in reality. It's not necessarily a bad hypothesis. But at what point do you say that the resistance, or the negative effects of trying to force things a certain way, outweighs just leaving things as they are? Alternatively, at what point is it better to slowly wean people toward a goal? Do you implement a rapid shift? Should it be a dictatorial approach where leadership mandates a change? Certainly senior leadership has measures to control culture, and they have a lot of different tools to implement those controls. I think this is especially true when it comes to implementing critical and emerging technologies, where new initiatives need to be driven through a culture that emphasizes proofs of concept.
The higher up you get, the more it is a culture game, whereas an individual contributor can operate in a microcosm. For instance, the Chief of Staff I mentioned before had already seen the results of successful data integration initiatives in prior positions, and intuitively knew that building a data culture and shifting each department up to the enterprise level would, in aggregate, have massive positive returns.
However, leaders don't always know what that transformation looks like. They don't know the exact path to integrating transformational technologies. They don't know whether their people will refuse the initiative, and they don't know what kinds of problems people will bring to the table. Some might say your approach won't fit here, because the reality is different for some of them. In government, there are initiatives where the juice just isn't worth the squeeze. Sometimes you can't justify the process and the cost for something that won't actually reduce the amount of time people put into it.
Are there stages of maturity? If so, is there a stage at which it makes sense to scale a data culture transformation?
There are stages of maturity in developing a data culture. In the first, a leader might come in and say, 'Hey, here's the vision. We need to start somewhere. Let's start by defining key objectives and working with a couple of flagship departments so they can provide a proof of concept for other departments to follow.' You ultimately want departments to get jealous of each other. You want them to have FOMO; it'll motivate a lot of people. Then you build on a couple of early wins, small-scale experiments that succeed. I speak heavily from the experience I've had with my current organization, which has gone through some pretty notable data culture shifts. When we talk about how to tactically accomplish the transformation, frankly, the biggest hurdle is getting people to just do things.
It would be a difficult job for a leader to go into an organization, sift through all the different rhetoric, and then identify every exact place to integrate data. As a leader, you need to genuinely listen to people when they say, 'Hey, it [a data integration] actually would not make sense here.' So then you make an exception for certain areas. That's what the negotiation looks like for changing a culture. You can't just force it on people. You have to be knowledgeable about where you can and can't integrate technology, and where methods shouldn't change.
How do organizations initially develop a data-culture?
Getting Started
Leadership likes to point out that you should just start, right? Everyone loves the idea behind those words. You can have all the right talking points from leadership. You can have all the tools on hand, but that's just the same as having a shelf full of educational books or videos. If you don't pick them up and read them, if you don't watch them, then you don't actually learn anything.
So too, when you're integrating data, you have to get those little wins by doing it, by trying it out. It's going to be messy. You're going to have trial and error, but being afraid that something isn't perfect right off the bat is the biggest enemy of progress. These are all things that people know and have heard for a long time, but they're true here too. So what are the biggest hurdles? Hesitancy, people not wanting to do it, people being against it, a lack of vision, and finally, a lack of talent.
For an initial win, should leaders choose a mundane opportunity, e.g., automation, or should they go after a cutting-edge initiative?
Identifying Early Use Cases
In my opinion, you'd want both. You want a project that is compelling, but you also want something that demonstrates real value. A compelling initiative might be marketable, but a mundane initiative can demonstrate that implementing technology is actually very doable. It's super simple, easy stuff, and it's low-hanging fruit. If people feel like it's low-hanging fruit, then they feel they've got a win close at hand. So you're giving people the bigger, cutting-edge vision, but also the smaller sense of 'I can actually do this.'
Then comes use case hunting, perhaps, where you're hunting down all the different places the technology could apply, because it's no longer theoretical. It's not broad talk, like 'machine learning and AI will change your world.' Now I have very direct use cases: it saves my team 20 hours a week that goes into analysis. In my case, it saves 20 hours' worth of assessments and analysis for finding those seats, where you want to be able to seat people and know that X number of seats are actually available as opposed to merely appearing taken. Once you've been able to define it, you've got use cases that demonstrate just how valuable these tools are.
At this point, it's not theoretical anymore, so leaders start gathering up the use cases and prioritizing based on budget requirements. As a senior leader, you're going to want to start ranking priority: what's more important, what it costs, and what we could actually fund. You're thinking in terms of political wins as well. What are things we could push through that won't be too expensive but will also be attractive to peers? There's a competition element between the military, the intelligence organizations, and other departments. You go roadshowing and make people a little bit jealous. You get that external excitement. If you feel like you're behind, you start picking it up; if you feel like you're ahead, people want to be like you, and you feel good about that. All of that feeds into the identity you have as an organization.
Implementing a Cultural Framework
Once they've done it, once they've found that perfect project and successfully executed it, it's no longer theoretical. It can go from a curiosity into an expectation: 'Hey, they did it, so why would you not be able to? If you're saying you can't, then I might need to get someone else who can, because we know it can be done.' That's where you would implement a cultural framework. What would that look like at that point? You go use case by use case, inefficiency by inefficiency, through the lens of the framework. If the beginning was about skepticism and persuasion, this would be the next phase, much as you framed it.
Scaling the New Cultural Model
The final phase would be maturation. Once you've nailed it, you scale that early success. You've got a couple of wins, you know it works, and you move from the theoretical into the tangible. Now you're scaling it: you've got your solutions, you know they're viable, and you start to implement them in a lot of places. That last phase, and this is obviously oversimplified, is going to be hardening. You've got the new technology, you've created SOPs, and now you're finding all the use cases, or rather the edge cases, to determine the capabilities and limitations of these technologies. You're locking down those final pieces of the logical problems people run into. We've now got a couple of different technical roles that have shifted to helping with those types of logic issues. At that point you're at a monitoring stage, where you're just making sure that the new tools and all these new methods are actually continuing to yield the results we want.