What use is data in organization design?
As access to tools that provide rich data about organizations proliferates, companies have a growing abundance of information available to inform their projects. Organization design projects require both quantitative and qualitative information, but “how much is needed?” and “to what end?” are questions I am increasingly asked.
Do we need data and, if so, what types and how much do we need? Where do ethical considerations fit into this? How do we focus attention and not get lost in a sea of data? Will benchmarks help or hinder our work? These are all questions that clients ask when undertaking an organization design project. I consider all of these in depth below and point to a number of important considerations, including the need to:
- Be clear about the question you are asking, and evaluate the scale and complexity of your organization design project to inform the data that is required.
- Collect data that will provide insights into the efficiency, cost and skillset needs of the organization, while also providing a basis for understanding the impact of changes on performance.
- Take into account the acceptability and legality of data you plan to collect, given current concerns about how it may be used.
- Use innovative graphical tools to show data in ways that focus on the salient information and avoid psychological biases.
- Use benchmarks and best practices carefully and only when their relevance and how they may be used is clear.
The art of organization design has evolved from the days of “manpower planning” in the 1980s, which largely involved figuring out how many people to employ and where they would report on an organization chart. Today an armory of HR systems, such as Workday, and specialized organizational tools, such as OrgVue developed by Concentra Analytics, can provide immense amounts of data and modeling capabilities to inform decision making within an organization that go beyond headcounts and reporting lines. However, as we all know, it’s one thing to collect data; it’s another to turn it into actionable information.
Do we need data in organization design?
Most organization design projects end up with many role and people decisions being made on qualitative factors and the intuitions of the most senior person involved. This is not necessarily a bad thing when it comes to a small group, given that selection for unique roles needs to be assessed on the ability to work in a team, the capacity to adapt to changing circumstances, the relevance of past experience, and levels of commitment. After all, as the saying goes: personnel is policy. However, it starts to look decidedly shaky and less reliable where large numbers of roles and people are involved. Why? Because data helps clarify priorities, ensures consistency across large groups, and reduces the likelihood of ad hoc decisions scattered across the organization. Data can also help to counterbalance the risk of supposed “experts” carrying too much weight in decision making without checking for supporting evidence (known as expert fallacy or false authority in psychology).
Organization design projects, more often than not, derive their impetus from one or more of three major drivers for change: strategic considerations (frequently the result of appointing a new CEO), the need for process changes (e.g. changes to how the supply chain needs to operate), or new technologies (e.g. implementing enterprise-wide systems or moving to the cloud).
The nature of the change driver will define the objectives and scope of work which, in turn, will clarify the data needs. As one of the founders of business history, Alfred Chandler, pointed out many years ago, “growth without structural adjustment can lead only to economic inefficiency.” The crucial point is that it pays to plan ahead and be selective about which data sets are needed, to avoid being lost in a sea of data and potentially chasing irrelevant, albeit interesting, anomalies.
In thinking about how scope impacts the volume and types of data needed, there are essentially two dimensions: scale and complexity. Scale refers to the volume of roles and employees involved; complexity reflects the levels of ambiguity and creativity required, the number of handoffs, and the relatively unique circumstances of roles and decision making (admittedly, not all of which are easy to apply data modeling to). The more scale increases, the greater the volume of data points needed; the greater the complexity, the more specific the requirement. (Put the two together and you will need more than a spreadsheet and a junior statistician to do the analysis.)
For example, rethinking the structure of a sales organization will typically involve a high volume of individuals but low complexity (i.e. the top left-hand quadrant), whereas doing the same with, say, a drug development team in a pharmaceuticals company would likely be smaller in scale but high in complexity (bottom right-hand quadrant). Compare these with an enterprise re-organization, which will require diverse and deep datasets (top right-hand quadrant). In each case, understanding the scale and complexity of the undertaking will help clarify what needs to be known.
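As a minimal sketch, the scale-and-complexity framing above can be captured as a simple lookup (the quadrant labels follow the examples in the text; the fourth, low/low quadrant is my own illustrative addition, not part of a formal model):

```python
def quadrant(scale: str, complexity: str) -> str:
    """Map a project's scale and complexity ('low' or 'high') to a quadrant,
    with scale on the vertical axis and complexity on the horizontal.
    The low/low description is an illustrative assumption."""
    grid = {
        ("high", "low"): "top left: high volume, low complexity (e.g. a sales reorganization)",
        ("low", "high"): "bottom right: small scale, high complexity (e.g. a drug development team)",
        ("high", "high"): "top right: diverse and deep datasets needed (e.g. an enterprise re-organization)",
        ("low", "low"): "bottom left: localized change, modest data needs",
    }
    return grid[(scale, complexity)]

print(quadrant("high", "low"))
```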
So, there are circumstances in which less may be more, but in most cases solid data will be useful to an organization design project in informing the choices that will have to be made.
Be clear about the question you are asking, and evaluate the scale and complexity of your organization design project to inform the data that is required.
If data can help, what types are needed?
Having defined scope and clarified scale and complexity, we should turn our attention to the myriad types of data sources that might be useful. The trick is to limit this to what is truly necessary based on the question to be answered. Typically, I find that there are four core reasons to collect data:
- Connecting design changes to business performance: This requires data such as revenues, cost ratios, strategy targets, productivity, and customer satisfaction scores. It makes it possible to link proposed changes to targeted business outcomes.
- Clarifying what the organization structure is costing: This includes data on numbers of employees (FTEs versus actuals), employee costs, hours worked, contractor costs. This provides clarity about the price being paid for the value being created (or not!) in the current organization.
- Evaluating the efficiency of the organization: Here we are typically looking at spans of control, activity analysis, number of handoffs in decision making, levels of management, number of job types, employee engagement scores. This, combined with the cost information, enables priorities to be established based on where the greatest impact can be made.
- Identifying gaps and insufficiencies in skillsets: This is where we dig into skill types and their match with future needs, the quality of succession plans, performance rankings, and demographics. This ensures alignment of capabilities that the structure is designed to support in pursuit of achieving the business strategy.
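The cost picture in particular lends itself to a simple calculation. As a minimal sketch, assuming a flat staff table (the `Worker` fields and figures are illustrative, not a standard HR schema):

```python
from dataclasses import dataclass

@dataclass
class Worker:
    name: str
    fte: float          # 1.0 = full time
    annual_cost: float  # salary plus benefits, or contract fee
    contractor: bool = False

def cost_summary(staff):
    """Summarize the price being paid for the current structure:
    headcount vs FTEs, and employee vs contractor cost."""
    employees = [w for w in staff if not w.contractor]
    contractors = [w for w in staff if w.contractor]
    return {
        "headcount": len(employees),
        "ftes": sum(w.fte for w in employees),
        "employee_cost": sum(w.annual_cost for w in employees),
        "contractor_cost": sum(w.annual_cost for w in contractors),
    }

staff = [
    Worker("A", 1.0, 90_000),
    Worker("B", 0.5, 45_000),
    Worker("C", 1.0, 120_000, contractor=True),
]
print(cost_summary(staff))
```

Even a toy summary like this makes the FTE-versus-actuals gap and the contractor share visible at a glance, which is the starting point for the efficiency analysis that follows.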
The data can be quantitative or qualitative. The latter generally comes from the opinions of employees and serves two important purposes: better understanding what lies behind the data, and visibly providing a means by which leaders and their teams know their voices have been heard.
Not all changes to structures can easily be connected to overall business performance. However, a connection to some type of performance improvement should be expected, even if it is qualitative (such as higher employee engagement scores). The higher the scale and complexity, the easier it is likely to be to draw a connection to overall levels of performance. The opposite is also true: lower scale and complexity are likely to show up in more localized benefits. In other words, not all design efforts involve the same level of data collection and analysis.
Collect data that will provide insights into the efficiency, cost and skillset needs of the organization, while also providing a basis for understanding the impact of changes on performance.
“Being good is good business”
The potential for collecting increasing amounts of data in ever-increasing detail necessitates attention not only to how it will be used, but also to how employees will respond to learning that data about them is being scrutinized. The EU’s GDPR has reinforced the need to give careful consideration to the potential intrusiveness of data collection and the privacy concerns that can arise from its use. As an example, Barclays Bank in the UK recently sought to use data on laptop usage by investment banking employees as a productivity measure. As a result of the backlash, the company had to promise to use the data only anonymously and in an aggregated format.
So, once again, there are good reasons to give consideration to what data is being collected, why, and who will have access to it – not least because of the new laws that are evolving, but also because of changing expectations that employees have about their rights to privacy and how information about them is used. As the late Anita Roddick, founder of The Body Shop, recognized way before we had the current laws to enforce it, “Being good is good business.”
Take into account the acceptability and legality of data you plan to collect, given current concerns about how it will be used.
Avoiding blindspots and misleading information
Organization design can get bogged down in numbers. Tables with huge volumes of data can be misleading, and hard to absorb or understand. It can be easy to convince an audience while actually misrepresenting the facts; for example, saying average x is bigger than average y may lead us to assume something about the relationship, but if the difference isn’t statistically significant we should probably ignore it.
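One way to guard against reading meaning into an insignificant gap between two averages is a simple permutation test. The sketch below uses only the Python standard library; the choice of test and the group data are illustrative, not a prescribed method:

```python
import random
import statistics

def perm_test(group_x, group_y, n_iter=10_000, seed=0):
    """Two-sided permutation test for a difference in means.
    Returns the p-value: the share of random relabelings whose
    mean difference is at least as large as the observed one."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(group_x) - statistics.mean(group_y))
    pooled = list(group_x) + list(group_y)
    n = len(group_x)
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:n]) - statistics.mean(pooled[n:]))
        if diff >= observed:
            extreme += 1
    return extreme / n_iter

# Illustrative: spans of control sampled from two departments.
p = perm_test([5, 6, 7, 6], [6, 5, 7, 7])
print(p)  # a high p-value: the difference in averages could easily be noise
```

If the p-value is large, the gap between the two averages is the kind of difference the text warns against over-interpreting.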
Plus, we are all susceptible to what cognitive psychologists call bias blindspot in how we interpret data (seeing the biases of others but not our own). Couple that with another phenomenon known as anchoring (focusing too heavily on an initial piece of information) and it’s clear that the more attention focused on salient information, the better.
So, it’s important to ensure the analysis is both relevant and not misleading. Fortunately, these days there are analytical tools available that are visually impactful, can focus attention on key issues and the options to be considered, and facilitate the decision making process. For example, tables of figures showing spans of control across an organization and down through each layer of management can make it hard to isolate differences, whereas an icicle diagram (middle row, top graphic below) makes it easy to see where differences and anomalies exist. Instead of Excel spreadsheets, think sunburst visualizations. Rather than stacked bars, use sunflower plots. Use heat maps instead of layer stacks.
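As a sketch of the underlying data such a diagram visualizes, spans of control per management layer can be derived directly from reporting lines (the toy org chart and the `{employee: manager}` representation are illustrative assumptions):

```python
from collections import defaultdict

def spans_by_layer(manager_of):
    """Given a {employee: manager} map (None marks the top role),
    return {layer: [span, span, ...]}, the raw data an icicle
    diagram would display. Layer 1 is the top of the hierarchy."""
    reports = defaultdict(list)
    for emp, mgr in manager_of.items():
        if mgr is not None:
            reports[mgr].append(emp)

    def layer(emp):
        depth = 1
        while manager_of[emp] is not None:
            emp = manager_of[emp]
            depth += 1
        return depth

    spans = defaultdict(list)
    for mgr, team in reports.items():
        spans[layer(mgr)].append(len(team))
    return dict(spans)

org = {"ceo": None, "vp1": "ceo", "vp2": "ceo",
       "m1": "vp1", "m2": "vp1", "ic1": "m1", "ic2": "m1"}
print(spans_by_layer(org))  # {1: [2], 2: [2], 3: [2]}
```

Once the spans are grouped by layer like this, outliers (a manager with one report, or with twenty) stand out in a way a flat table rarely shows.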
Use innovative graphical tools to show data in ways that focus on the salient information and avoid psychological biases.
Do benchmarks help?
Benchmarks and their close sibling, best practices, are often conflated in people’s minds, but they have different uses and values.
Benchmarks can be useful, but also problematic. All too often I have seen that when they appear to confirm a set of biases and preconceptions (see above), they are greatly valued; but when they provide an unwanted answer, they are “interesting, but not really relevant to our unique circumstances”! A common problem is that the question to be answered when collecting benchmark data has not been sufficiently clarified, so the data sets that are collected are overly broad and imprecise, risking distracting arguments about their relevance rather than helping to focus discussion.
No doubt, they can be a useful catalyst for debate that can mobilize energy around the need to change and they can provide value in other ways too, but not always to the extent that warrants the time and money involved.
Best practices, on the other hand, can be more useful. By focusing on examples of how work is being carried out in practice, they help provide a more concrete representation of possibilities for how work might be done differently in future. Certainly, they suffer from some of the pitfalls of benchmarks, but often have more practical relevance and contribute more directly to brainstorming discussions than benchmarks.
Use benchmarks and best practices carefully and only when their relevance and how they may be used is clear.
In summary, it is rare that data and analytical capability are not required in an organization design project, but the type and volume of data needed, and what it will help you solve, are important to determine up front. Lost time circling back for data that should have been in scope at the beginning, or chasing down interesting but irrelevant rabbit holes, can dramatically increase the risk of stalling a project.
Next time your organization is planning a redesign, taking account of how data contributes to achieving a successful outcome will be important. If you are initiating a project and we can help, let’s chat. Redwood Advisory Partners has decades of experience supporting clients through the design and implementation of organizational transformations.
Follow or connect with the founder, Stephen, on LinkedIn (where you can find other articles in this series) or visit our website at www.redwoodadvisorypartners.com. You can contact Stephen directly by email at [email protected]
- Chandler, Alfred D. (1962), Strategy and Structure, Martino Publishing
- Roddick, Anita (1942–2007), Founder of The Body Shop: “Being good is good business”