Are You Losing Money to Poor Data Quality?
To avoid contributing to the trillions of dollars lost to poor data quality, executives should become drivers of change and know how to formulate a data strategy, keeping four key components in mind.
As an executive in global real estate, your days are filled with trying to understand how changes to asset performance, access to capital, and market conditions will impact your returns. To get these insights, you often rely on tens if not hundreds of people across your organization to consolidate large and complex data sets across the departments and regions of your business.
Yet while many executives focus on making sense of the data they receive, few question the efficiency of the systems and processes that produced it, and fewer still explore what they can do to improve those systems. This is a missed opportunity.
This disconnect between senior management and front-line staff wastes capacity: staff could be spending much of their time answering questions that are currently left to a handful of executives. To realize these benefits, executives must shift their focus from properties to processes and know how to raise concerns with the right department, which can then improve the team's performance where needed.
For some, creating a plan for improved data management and intelligence gathering will come easily: the steps seem logical, so simply work backwards through the chain of data gathering and see what can be automated. Sounds simple, right? The trouble is that, with the sheer volume of data you can now collect from numerous internal and external sources across multiple regions and sectors, aggregation and standardization become difficult, which is often why they are ignored. In the US alone, poor data quality has cost the economy over $3.1 trillion. At that scale, few businesses will be exempt, which is good cause to address it in yours.
Here are four steps to get started.
Establish Compelling Goals
Before thinking about the systems themselves, it is important to establish project goals that will be a meaningful driver of change in your business. Many executives will find reasons to support or avoid these changes, so identifying a compelling reason is essential. Reasons can include:
- minimizing risk
- compliance with regulations (e.g. financial reporting)
- integrity of data
- increasing revenue and profitability
- integrating asset and market data
Once you have found a reason that will stick with your team, define the architecture for your data that will create a strong foundation on which to align all of your information.
Define a Data Model
The first step in the process is to acknowledge that a set of standard definitions for how data is collected is a necessary prerequisite to reliable data. These definitions provide the context in which data can be understood, and they make a real difference to the insight that can be drawn from it. A clear example of how differing definitions can lead to very different results is the measurement of floor areas. Superficially, this seems a straightforward task with a standard approach and standard results. Yet research by Jones Lang LaSalle showed that differences in area-measurement methodologies can lead to variations in results as high as 24%.
The underlying data model is the key to any data warehouse's ability to solve this issue, because it aggregates data through a specific, defined method. To be successful, a data model must accomplish several things. It must accurately represent the true relationships between entities (e.g. assets, tenants, loans) in the model, yet be flexible enough to cope with different levels of aggregation when it is not possible to collect information at the same level of detail from all sources. Most importantly, the data model must be precise, with every field clearly defined, along with the rules about how fields and entities relate to one another.
Without this standardization, data definitions become ambiguous to the point where your consolidated information is just a mess of information.
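To make this concrete, here is a minimal sketch of such a data model in Python. The entity and field names (Asset, Tenant, rentable_area_sqm, and so on) are hypothetical illustrations, not a prescribed schema; the point is that each field carries one precise definition and the relationships between entities are explicit.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical entities for illustration; a real model would be far richer.
@dataclass
class Tenant:
    tenant_id: str
    name: str

@dataclass
class Asset:
    asset_id: str
    region: str
    # A precise field definition: rentable area measured under a single
    # agreed standard, always recorded in square metres.
    rentable_area_sqm: float
    # An explicit relationship: one asset has many tenants.
    tenants: List[Tenant] = field(default_factory=list)

office = Asset(asset_id="A-001", region="EMEA", rentable_area_sqm=12500.0)
office.tenants.append(Tenant(tenant_id="T-042", name="Acme Corp"))
```

Because every source must map its data onto these agreed fields, the 24% variation problem above is resolved at the point of definition rather than argued about at the point of reporting.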
Data Validation & Accuracy
The downfall of any data collection initiative is the familiar "garbage in, garbage out" problem. For the solution to be of any use, you must be certain that the data you have collected is complete and accurate. Even once a method of input has been standardized, there is still room for human error in entering the data. Therefore, the ability to effortlessly and automatically validate incoming data, based on rules that can be easily created and modified without onerous development and testing, is crucial.
Standards and rules provide consistent, comparable data across all regions and asset classes. For a system to be successful in validating data, rules and logic must be developed that allow the system to detect data quality issues before they make their way up the system into your hands as an executive. In general there are two types of rules that must be established:
- Technical Validation – Ensures that incoming data satisfies the required formats and other input criteria. For example, a field that requires a number should not contain a character in its place, and values should follow the specified format (e.g. 1.00 vs. 1).
- Business Validation – Ensures that standards specific to your organization are met so there are no discrepancies in the data. For example, the rentable area should not exceed total property area.
If the rules are unclear to a system as to how it should interpret and consume information, a single source of the truth can quickly become just a single source.
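The two rule types can be sketched as simple, easily modified checks. This is an illustrative example only, with assumed field names (rentable_area, total_area) taken from the business-validation example above, not the interface of any particular validation system.

```python
# Hypothetical rule-based validation; field names are illustrative only.
def validate_record(record):
    """Return a list of data-quality errors for one incoming record."""
    errors = []

    # Technical validation: required numeric fields must actually be numbers.
    for fld in ("rentable_area", "total_area"):
        value = record.get(fld)
        if not isinstance(value, (int, float)):
            errors.append(f"{fld} must be numeric, got {value!r}")

    # Business validation: organization-specific consistency rules,
    # run only once the technical checks have passed.
    if not errors and record["rentable_area"] > record["total_area"]:
        errors.append("rentable_area exceeds total_area")

    return errors

clean = validate_record({"rentable_area": 9000, "total_area": 10000})   # no errors
flagged = validate_record({"rentable_area": 11000, "total_area": 10000})
```

Because each rule is a small independent check that appends a described error, new rules can be added or adjusted without reworking the pipeline, and issues are caught before they propagate upward.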
Leverage Actionable Data
Finally, the data warehouse will only be of practical use if it is easy both to import new data and to access up-to-date data when it is needed. Data will need to be imported from disparate sources, such as software systems and spreadsheets. Without mechanisms for making these outputs available to the correct teams across the business, work may be duplicated and opportunities may be overlooked.
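As a small sketch of the import side, the snippet below normalizes a spreadsheet-style CSV export into the warehouse's standard field names before validation. The source column headers and the mapping are assumptions for illustration, not a real system's layout.

```python
import csv
import io

# Hypothetical mapping from one source system's column headers
# to the warehouse's standard field names.
COLUMN_MAP = {
    "Property ID": "asset_id",
    "Rentable Area (sqm)": "rentable_area",
}

def import_rows(csv_text):
    """Read a CSV export and rename its columns to the standard model."""
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = []
    for raw in reader:
        # Columns without a mapping keep their original name.
        rows.append({COLUMN_MAP.get(k, k): v for k, v in raw.items()})
    return rows

sample = "Property ID,Rentable Area (sqm)\nA-001,12500\n"
rows = import_rows(sample)
```

Each source system would get its own mapping, so disparate exports all land in the same standard shape and flow through the same validation rules.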
Until recently, technological limitations made it extremely challenging for organizations to establish the data architecture necessary to carry out a comprehensive data strategy. Recent advancements, however, have made rolling out a solution much simpler. On top of this, developments such as machine learning and distributed technologies (e.g. blockchain) suggest that there are even more advancements to come. To avoid falling further behind, businesses must start planning their data strategy today.
With a few simple steps like these, executives can set themselves up to become drivers of change in their organizations, and will see greater benefit from making their staff more productive than from focusing on asset returns alone.
IBM. 2015. "The Four V's of Big Data."
DalleMule, L., and T. Davenport. 2017. "What's Your Data Strategy?" Harvard Business Review.