The Commercial Real Estate Industry Needs to do More in Leveraging Machine Learning


Machine learning and other cognitive computing technologies remain the hot, disruptive solutions marketed and touted by every software company out there, as the money flowing into AI-related startups continues to outpace other segments. The resulting "hype" has created a lot of unfortunate noise about machine learning's value, and some commercial real estate leaders are having trouble navigating the buzzwords as they try to understand what this really means for them. Despite the noise, the power that machine learning can bring to an organization is real, clear and demonstrable, augmenting how we work and personalizing our experiences as consumers and employees. Instead of following rules-based programs, machine learning does just what the name implies: it learns from data and history to surface insights and patterns that standard business intelligence and other legacy analytics programs can't find. On top of that, some of the most impactful results come from leveraging data that is sourced from and specific to a company, building, employee or region, providing personalized results that are unique and actionable.

A report by Deloitte last year revealed that early innovators leveraging machine learning, Natural Language Processing (NLP), computer vision and other cognitive technologies are already seeing business benefits. In a survey of 1,100 IT leaders, 55% said their company's adoption of AI had enabled them to increase their lead over competitors, and 9% said it enabled them to leapfrog ahead. That's pretty compelling. So what about the commercial real estate industry? Is it taking advantage of the power that machine learning can bring? Not yet.

This isn't to say that no one is leveraging cognitive computing technologies, but adoption has been limited at best. There are some wonderful machine learning solutions available today across all segments of the industry, and I'll name a few later. First, though, why has the industry been slow to adopt machine learning compared to others? Here are four big reasons:

  1. Data and knowledge intensive – The commercial real estate industry is fairly old compared to others, and it has historically been led by relationship-driven, "run-on-my-gut" management, albeit with a relatively high success rate. That success rate feeds a legacy set of leaders who are skeptical of and resistant to change. On top of this, the industry truly runs on data but doesn't have a great track record for compiling, housing, cleaning and leveraging that data. Legacy software applications were historically closed and did not support data integrations well, so consolidating the data took a lot of effort. This has been a major inhibitor, keeping the industry from moving as quickly as others in leveraging data and machine learning. That's changing rapidly on the vendor side as new entrants provide more modern alternatives while pushing the legacy vendors, but a lot of data is already locked up in a myriad of systems across each company.
  2. Fragmented, with a myriad of legacy and modern applications – The industry has not attracted much software innovation in the past due to its heavy fragmentation (there are tens of thousands of owners, and even the biggest are relatively small compared to other industries). Though owning or investing in real estate sounds simple, the information needed to oversee and manage commercial real estate is fairly unique and broad (rent rolls, investor administration, complex lease terms, multiple regulatory agencies, tenant support, building operations, etc.). It therefore takes a collection of specialized applications to meet end-to-end business needs, as no one application truly meets every need for every segment (commercial/multi-family, hotels, industrial, single family, geographical regions, etc.). On the positive side for machine learning, this fragmentation also adds to the volume of data that can be captured across the investment and operational lifecycle.
  3. Lack of attention to data quality – With all this data coming at every commercial real estate organization, very few have the data governance maturity needed to yield high-quality data. A recent poll of fellow commercial real estate CIOs showed that one of the biggest hesitations in moving forward with machine learning and similar technologies was a lack of confidence in data quality. It's not the only reason, but most CIOs rightfully realize they need good data to achieve the best results.
  4. Resource constrained – As noted earlier, the industry is primarily made up of small- to medium-sized companies, which means organizations without large employee bases that are inherently resource constrained. That's the catch-22 with machine learning: it can make employees more efficient by automating the more mundane, operational tasks, freeing up time to focus on customers and knowledge-based work, while also arming employees with new, data-driven insights to carry out that work.

OK, so now that we've reviewed why the industry has been slow to adopt, let's focus on nine ways machine learning can add business value to the commercial real estate industry, along with some examples of companies already bringing that value (I've included a few rough code sketches after the list to make some of the ideas concrete):

  1. Proactive and predictive insights on asset conditions and failures: One valuable use of machine learning that is gaining traction and validation is more efficiently operating and managing a building's physical assets and IoT devices. Specifically, it can convert the fault alerts produced by building equipment into actionable notifications, providing early warnings when important assets might be failing or need attention. A number of companies filter through these warnings based on patterns and historical resolutions, while some also look more broadly at other data points, such as work order history or manufacturer/device history, to identify and predict when an asset is in danger of failing. Fixing or replacing a piece of hardware proactively is much more cost-effective and far less disruptive to employees, so predicting when an asset might fail is valuable (a rough code sketch of the idea follows this list). Some of the companies addressing these issues are BuildingIQ, Enertiv, and Switch Automation.
  2. Gaining proactive insights into tenant space needs, operational issues and other factors affecting NOI: Most of the "insights" available today come from operational applications or traditional business intelligence tools. Those are great for understanding what happened in the past, but they are not ideal for predicting what might happen next, nor are they good at leveraging the disparate sets of data available to provide proactive property and tenant insights. Okapi is one company actually using machine learning in this manner for commercial and multi-family, combining internally sourced data (work orders, lease expirations and terms, parking, A/R, etc.) with external data (weather, tenant growth metrics, etc.), while diffe.rent and home365 are two examples in the single family space.
  3. Occupancy and space utilization, and the personalization of the workplace: Understanding an office's space utilization patterns is one of the most impactful but least optimized functions of a corporate real estate organization, with the biggest issue being the difficulty of understanding how office spaces actually get used day to day. With COVID, this topic has become even more critical to ensuring a safe return to office, and some of the solutions being deployed leverage computer vision to ensure employees keep safe distances and to confirm that occupancy stays at safe levels. Understanding and predicting usage is also important for keeping space safe while supporting employee productivity and identifying future opportunities to support growth. Machine learning can analyze disparate data coming from sensors, room booking, badging, and other siloed sources and highlight usage patterns that might be unique to a specific building, region or department (see the clustering sketch after this list). A few companies playing in this area are Digital Spaces, Density, and VergeSense.
  4. Enhanced tenant and employee engagement: One of the biggest trends in commercial real estate is the explosion of employee- and tenant-facing apps that aim to connect users to the services and communities that matter most. They support conference room booking, facility and work order requests, class registrations, ride hailing, cafe menus, and many other features that today's employees increasingly expect. The more sophisticated apps leverage machine learning and historical data to suggest specific conference rooms, advise of non-bookable working spaces that might become open, or recommend class registrations or specific parking spaces, among other tasks. It may not sound like much, but removing any friction or keystrokes from an employee's day goes a long way toward job satisfaction. Some examples of these technologies are CBRE's Host, Workwell, and HqO, to name a few.
  5. Insights into property valuations and buying opportunities: The selling price of commercial real estate has many factors, so determining the best value for an asset, or highlighting underpriced assets, is a great example of where machine learning can add value. Most buyers use discounted cash flow and other financial models to help determine an asset's current value, so the more accurate the assumptions on rent growth, occupancy, and market rents and demand, the better the valuation model will be (a simple DCF sketch follows this list). In 2018, there was more than $562 billion worth of commercial real estate transactions in the U.S. alone, and this large transaction volume offers a treasure trove of data. It's easier said than done, but companies like Skyline AI are developing machine learning algorithms that give investors and partners access to these more sophisticated insights.
  6. Computer vision: Computer vision applies machine learning to images and video to extract insights, and while it's still early, it will be transformative for real estate over time. Computer vision also powers robots that can navigate and monitor both indoor and outdoor spaces in various ways. There are many use cases in production today (I've lumped them together for simplicity), such as occupancy counts, identifying shopper demographics, security notifications on crowd gatherings, license plate and visitor blacklisting, employee building access, employee or tenant sentiment, and even early warning notifications to law enforcement when an active shooter first draws a gun. Though there are real privacy concerns when the technology is not used properly (a larger topic on AI ethics that needs its own summary), the technology is here and already in use. Companies like Trueface, Aegis, Knightscope, Ambient.ai, and Cobalt Robotics are just a few examples.
  7. Automating the lease abstraction process: Real estate is one of the most document-intensive industries, so it makes sense to leverage machine learning to automate some of its unique processes. The lease abstraction process in particular is manual, owing to the industry's non-standardized leases and the variation of terms and clauses found in every lease. By utilizing NLP, a form of machine and deep learning that analyzes words and context learned from history, the lease abstraction process can be augmented to improve efficiency and lower expenses (a rough sketch follows this list). Leverton was one of the early pioneers in the industry and was recently bought by MRI, while DealSum is another example. Most of the cloud platform players also have advanced NLP capabilities, with Google being one of the leaders in this arena with their Document AI product. We are very early in this segment, as labeling is complex and time-consuming, but it has high ROI potential for larger firms with a high volume of leases.
  8. Automating the work request process: Creating a ticket in most facility management applications can be cumbersome and time-consuming, since many applications require multiple inputs (location, request type, urgency, description, etc.) to process and assign the ticket. To simplify the user experience, NLP models can leverage the words used in historical requests to automate the process, requiring only a basic description of the problem. Machine learning programs learn from the language used to describe a problem, taking phrases like "water leak," "broken handle," "coffee spill" and other words used in previous requests to assist and automate the creation and assignment of the ticket (see the classification sketch after this list). Most work order systems today don't yet have this capability built in, but I've personally been involved in developing similar efforts that leverage some wonderful machine learning platforms like Google's GCP AI products and Microsoft's Azure AI.
  9. Leveraging chatbots to interact with tenants or employees: This last example is actually the most pervasive machine learning use case across all industries today. Known more formally as "conversational AI," chatbots and virtual assistants leverage machine learning and historical data to automate the most redundant, typical and time-consuming requests handled by employees. You've likely come across a chatbot while visiting a website, or maybe you've "chatted" with "someone" via web support when it very well could have been a chatbot. The beauty of a chatbot is that it's always on and waiting, and it can handle the first-level interactions that cover the majority of requests. Developed properly, it can escalate an issue to a live person if a question isn't being answered or upon the user's request (a minimal sketch of the matching-and-escalation idea follows this list). In commercial real estate, chatbots have been deployed to answer tenant questions and resolve facility issues, with two examples being the Bengie app from Building Engines and CBRE's Host.
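
As promised, here are a few rough code sketches (in Python) to make some of these use cases concrete. None of this is production code or any vendor's actual implementation; they are minimal illustrations, and every file name, column name and threshold is a made-up assumption. First, the failure-prediction idea from item 1: train a classifier on a hypothetical history of asset alerts joined with work-order outcomes, then rank assets by predicted risk.

```python
# Minimal predictive-maintenance sketch; the data file and columns are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# One row per asset per month: recent alert volume, service history, age, runtime.
alerts = pd.read_csv("asset_alert_history.csv")

features = alerts[["alert_count_30d", "days_since_service", "asset_age_years", "runtime_hours"]]
label = alerts["failed_within_30d"]  # 1 if the asset failed within 30 days

X_train, X_test, y_train, y_test = train_test_split(features, label, test_size=0.2, random_state=42)
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")

# Rank assets by predicted failure risk so technicians see the riskiest first.
alerts["failure_risk"] = model.predict_proba(features)[:, 1]
print(alerts.sort_values("failure_risk", ascending=False).head(10))
```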
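
For the occupancy use case in item 3, one simple approach is to cluster rooms by their average hourly occupancy profile so that usage patterns ("all-day heavy use," "mornings only," "rarely used") fall out of the data. This sketch assumes a hypothetical hourly badge/sensor extract:

```python
# Cluster rooms by hour-of-day occupancy profile to surface usage patterns.
import pandas as pd
from sklearn.cluster import KMeans

# Hypothetical extract: one row per room per hour with an occupant count.
usage = pd.read_csv("hourly_occupancy.csv")  # columns: room_id, hour_of_day, occupant_count

# Build one occupancy profile per room: average headcount for each hour of the day.
profiles = usage.pivot_table(index="room_id", columns="hour_of_day",
                             values="occupant_count", aggfunc="mean").fillna(0)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
profiles["usage_pattern"] = kmeans.fit_predict(profiles)
print(profiles["usage_pattern"].value_counts())
```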
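
On valuations (item 5), the discounted cash flow math itself is simple; the hard part is the assumptions, which is exactly where machine learning helps. Here's a bare-bones DCF for a single asset, with every input being an assumption a model could sharpen:

```python
# Bare-bones discounted cash flow: value = sum of discounted NOI plus a
# discounted terminal value (next year's NOI capitalized at the exit cap rate).
def dcf_value(noi_year1, rent_growth, discount_rate, exit_cap_rate, hold_years=10):
    value = 0.0
    noi = noi_year1
    for year in range(1, hold_years + 1):
        value += noi / (1 + discount_rate) ** year
        noi *= 1 + rent_growth  # after the loop, noi is the year hold_years+1 figure

    terminal = noi / exit_cap_rate
    return value + terminal / (1 + discount_rate) ** hold_years

# $1M first-year NOI, 3% rent growth, 7% discount rate, 6% exit cap.
print(f"${dcf_value(1_000_000, 0.03, 0.07, 0.06):,.0f}")
```

Move the rent-growth assumption by even half a point and the value shifts meaningfully, which is why better-trained assumptions are worth real money.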
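
For lease abstraction (item 7), a generic NLP model can already pull out candidate dates, dollar amounts and parties from a clause, though true abstraction requires a custom-trained model and labeled lease data. A minimal sketch using spaCy's stock English model (not lease-specific):

```python
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")  # general-purpose model, not trained on leases
clause = ("Tenant shall pay Landlord base rent of $42,500 per month, "
          "commencing on January 1, 2020 and expiring December 31, 2029.")

# Surface candidate lease terms for a human abstractor to confirm.
for ent in nlp(clause).ents:
    if ent.label_ in ("DATE", "MONEY", "ORG"):
        print(f"{ent.label_:>6}: {ent.text}")
```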
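
The work request idea in item 8 is classic text classification. This sketch trains a TF-IDF plus logistic regression pipeline on a handful of made-up historical descriptions; a real system would train on thousands of past tickets:

```python
# Learn ticket categories from the words used in past request descriptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

history = [  # illustrative stand-ins for real ticket history
    ("water leak under the sink in suite 400", "plumbing"),
    ("faucet dripping in the mens room", "plumbing"),
    ("broken handle on the stairwell door", "carpentry"),
    ("coffee spill in the 3rd floor kitchen", "janitorial"),
    ("carpet stain near the elevator lobby", "janitorial"),
    ("hvac blowing warm air in conference room", "hvac"),
]
texts, categories = zip(*history)

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, categories)

# A new request needs only a free-text description; the model suggests the routing.
print(model.predict(["there is water dripping from the ceiling"]))  # likely ['plumbing']
```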
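
Finally, for the chatbots in item 9, the core loop can be as simple as matching an incoming question to the closest known answer and escalating to a person when confidence is low. A toy retrieval-style sketch, with made-up FAQs and an arbitrary threshold:

```python
# Match a tenant question to the closest canned answer; escalate on low confidence.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

faqs = {
    "what are the building hours": "The building is open 7am to 7pm on weekdays.",
    "how do i book a conference room": "Use the tenant app under Rooms > Book.",
    "where do visitors park": "Visitor parking is on levels 1-2 of the garage.",
}

vectorizer = TfidfVectorizer()
faq_matrix = vectorizer.fit_transform(list(faqs.keys()))

def answer(question: str, threshold: float = 0.3) -> str:
    scores = cosine_similarity(vectorizer.transform([question]), faq_matrix)[0]
    best = scores.argmax()
    if scores[best] < threshold:
        return "Let me connect you with the management office."  # escalate to a person
    return list(faqs.values())[best]

print(answer("can I book a conference room for tomorrow?"))
```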

This is just a short list of the machine learning use cases and companies out there today, but I hope it provides insight into what is possible and the business value that machine learning can provide. Over time, these and other capabilities will become mainstream in the commercial real estate technology world. For now, the early innovators are ahead of the game. Are you one of them?

If You’re Not Leveraging or Considering AI in Some Fashion, You’re Already Behind.

It's been said that data is the new gold. If you believe this, as I do, then you should believe that artificial intelligence (AI), and machine learning in particular, will have a tremendous impact on the way we work. Though not new, machine learning is the next step in the evolution of data analytics, and every company should already be looking at where it can take advantage of this transformational technology. If you aren't doing that already, you're behind.

Data analytics too often is about looking backward at what happened, while machine learning looks at the same historical data with an eye to the future. It uses patterns in the data to provide insights and recommendations, and it can help automate tasks where historical patterns are a good predictor of the future. Machine learning is just one application of artificial intelligence, but it's the most accessible and relevant for most enterprises. There is a lot of debate about how robots powered by AI might replace our jobs, but we're still too early in the AI evolution to worry about full-scale replacement. Many firms are already taking advantage, augmenting their employees' work and freeing up time for more knowledge-based activities. We're a long way from large sets of jobs being replaced, but AI can and should be used to augment and improve processes, while also personalizing how each employee or consumer works or shops.

Machine learning also opens up the world of prescriptive analytics, which takes predictions a step further by suggesting actions based on the predicted event. Just because you know something will happen doesn't mean you'll take the most meaningful action. Scale also becomes more attainable with AI: the augmented work, insights and predictions open up a world of doing much more with less.
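
As a trivial illustration, the prescriptive layer can be as simple as mapping a model's predicted probability to a recommended action. The thresholds and actions below are made up, but they show the jump from "what will happen" to "what should we do":

```python
# Hypothetical prescriptive layer over any predictive model's output.
def suggest_action(failure_probability: float) -> str:
    if failure_probability > 0.8:
        return "Order parts and schedule replacement this week"
    if failure_probability > 0.5:
        return "Dispatch a technician to inspect on the next visit"
    return "No action needed; keep monitoring"

print(suggest_action(0.85))  # -> Order parts and schedule replacement this week
```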

If you're still figuring out how to get started, don't worry. Foundationally, it's much easier to start today than it was just one year ago. The cloud has accelerated machine learning adoption and accessibility, and it's a perfect environment for machine learning: you can rent a lot of compute for short periods of model training, and the number and quality of the models available increase daily. All the major cloud vendors now make it easy to tap into their models no matter where your data resides, while also providing open APIs for integrating the results into usable forms. There are also a multitude of vendors that can help you start with a small project as you begin your learning exercise. As with other new technologies, it's smart to start small: test, learn, try and discover.

When looking at where to begin, think about what business problems could be solved by automating a routine task, and look at problems where understanding historical trends can improve decision making. As you approach the project, keep the Agile methodology in mind: identify the business problem and pick a short-term win that can be accomplished in 4-6 weeks. Take the Minimum Viable Product (MVP) approach and don't try to boil the ocean. Try it, learn, iterate, and go through it again.

Some examples of where to look are:

·       Repetitive tasks – Are there routine tasks that staff perform that also generate transactions or other data sets? Do you get repetitive requests where history is a good indicator of how to process them? Is there routine data movement between systems?

·       Providing insights – Are there key, event-based processes where you might improve the outcome next time if historical data gave you insight into the quality of your past decisions?

·       Reducing noise – Operationally, do you have notifications that generate more data than you can easily sift through? Machine learning can help you cut through the noise and clutter.

·       Improving the user experience – Are there processes in your operations where employees must go through multiple steps to get help or generate requests? There are many use cases where machine learning can help employees get what they need quicker, easier, and cheaper.

·       Personalization – By leveraging user preferences and habits, the user and employee experience can be personalized to be better and more engaging. What temperature do you like in your workspace? Which conference room do you tend to use, or which is most convenient? What foods do you like most, so you can be alerted when they're available in a nearby café?

·       What’s in your data? – Look at where you have a lot of data. Just looking at the systems or devices that generate heavy data volumes can spark ideas for where to start. The amount of data you have to train and test a model matters, so start with what drives your current data analytics requirements and you'll likely get ideas from there.

One last note about your data. I've written many times about data quality and the importance of data governance, and machine learning is a great example of why that's critical. Hopefully you have a good data governance program in place and your data is reasonably clean, because data quality will be key to a successful AI pilot. Having said that, you don't need perfection to start. Just starting an AI project will give you insights into your data, and hopefully set you on a journey of organizing and improving its quality once you understand what you really have.

If you haven't thought about starting an AI program, you'd better start soon. I'd bet your competitors are already on the journey, and you'll soon be left behind.

CRE Tech 4.0 – Trends (Still) Here to Stay

Last fall, I posted my thoughts on what was happening in the accelerating Commercial Real Estate Tech (CRE Tech) investment world and the megatrends influencing it, where I stated that this CRE Tech 4.0 hyper-investment cycle would have a more lasting effect than previous rounds. What's amazing is that the last six months have truly been a microcosm of everything that's unique and different within the CRE Tech world these days. Last week, Fifth Wall publicly launched their real estate-specific venture capital firm, announcing the $212 million fund that started investing last fall. CBRE is a major LP and strategic partner, along with Hines, ProLogis, Equity Residential, Lowes, Host Hotels, Rudin, Lennar, and Macerich. On top of Fifth Wall's fund announcement, the tech merger and corporate tech buying spree has gotten heavier. In the biggest CRE merger in a long time, VTS and Hightower merged in November. I got to know Nick, Brandon and both of these teams well while at Shorenstein, and I always believed they'd be stronger together. Though not a true liquidity event, the merger was a very positive sign for the industry.

Then in January, The News Funnel purchased the CRE // Tech Intersect conference brand. Also in January, CBRE purchased Floored, an innovative company that lets owners and corporations visualize, model and collaborate on office build-outs of every size and taste. I got an opportunity to see Floored's potential a few years ago when we were preparing to completely remodel a large office building in Houston. The product seemed perfect for letting prospective tenants visualize, 18 months in advance, how a space could look with different fits and configurations, and what the views would be from various floors and directions. You could even grab an Oculus Rift and feel immersed in the space while sitting in your chair. CBRE purchasing the company was yet another eye-opener on what's different in this investment cycle. Lastly, just last month CBRE made another purchase, acquiring Mainstream Software, a leading SaaS-based CMMS application.

I can also say from firsthand experience that interest in these new technologies from the commercial and corporate real estate community is broader and deeper this time. Owners, investors, occupiers and everyone else involved are feeling the change all around them as tenant and employee working environments, expectations, and tastes change, and whole asset classes are transformed.

In my previous post, a big theme was the CRE Tech 4.0 story, which highlighted the latest wave of CRE tech investing while keeping a wary eye on the littered past. In line with the large amount of startup investing overall, CRE tech investment levels have skyrocketed over the last few years. Real estate technology investments across all asset classes ballooned to $2.7 billion in 2016, compared to $451 million in 2013. Considering that 2013 was a record year by miles, the money flowing in has been astonishing. But with few survivors from previous cycles, the question was whether this cycle would be different. I stated then that it was, and that belief is even stronger now. As Jim Young pointed out in his recent posting, there are both positives and negatives to this cycle. I still see risk in the disparity between the dollars flowing in and the exits, but the investments have pushed the CRE tech industry forward permanently. There will be a correction and many of these companies will fail, but the best will survive and the industry will be better off.

The other part of my earlier post focused on the disruptive trends taking hold within CRE. Those influences have only gotten stronger in the last six months. The categories are still the same: artificial intelligence, robotics, autonomous cars, AR/VR, IoT, data analytics, and blockchain, but the influences are more profound. These are all trends included in the "Fourth Industrial Revolution," a book and related articles from Klaus Schwab and the World Economic Forum, which I highly encourage anyone interested in these trends and their influences to read. In this new industrial revolution, we are moving from the basics of computers, the internet and software automation to more advanced cyber-physical influences. These megatrends are also highlighted by Jim Young and the Realcomm team as the next CRE Tech wave, or phase 5. No matter how you classify them, these are the trends that will have long-lasting effects on how we work, play, travel, interact and live, and they will impact society at large.

Related to the commercial real estate industry, here is an updated high-level summary of the trends taking hold. Pages could be written on each subject, but I'll just touch briefly on their CRE impact:

  • Artificial Intelligence – AI continues to garner the most attention, with a significant amount of investment flowing in. The CRE use cases are endless, particularly when you combine it with IoT. Whether it's automating lease abstraction, predicting building equipment failures, automating first-level security monitoring, analyzing occupancy foot traffic, the advent of bots and conversational AI, or learning from tenant or client feedback, we're early in this journey. Robotics, which pairs machine learning with sensors and IoT, is growing, as are autonomous cars, which will have a profound effect on commercial real estate. Specifically within AI:
    • Machine Learning – Leveraging massive amounts of data and the computational advances powered by GPUs, machine learning algorithms trained by data scientists can help owners and occupiers make better decisions on where investments should be made, how space can be better utilized, where personalization might increase customer engagement, what equipment should be replaced ahead of disruptive failures, and how bots might improve customer support. The hurdle for many companies is not only the amount of data needed to make informed decisions, but the quality of that data. As I stated in my recent Data Quality posting, your results will only be as good as the quality of your data; its importance can't be overstated. Though a few real estate companies are now getting their feet wet with AI, every company should be starting down this journey, looking for pilots where they can learn how to best leverage these technologies throughout their organization.
    • Robotics – Last year at Realcomm, a great deal of attention was on robots. They were found not only in the exhibit area but around many of the session rooms and throughout the halls. Since then, I've had the opportunity to see a few of these in action, and I also had a chance to speak with the Knightscope team. Their current product is focused mainly on security, but there is new interest in customer service support at malls. Capturing 90 TB of data per year, their robot can recognize 300 license plates a minute, has sensors with a range of 300 feet, can grab MAC addresses for anomaly detection, and learns via IBM Watson to improve its interactions. While this use case is aimed mostly at office campuses, the general AI-powered robot is here to stay.
  • Autonomous Cars – Driverless cars and their effect on real estate continue to generate a lot of attention. The reality is that they're already here in some form, so it's no longer a question of if. I previously mentioned Uber's purchase of Otto and their rollout of pilots in various cities, along with NuTonomy's autonomous car launch in Singapore. Again, a lot has changed in six months: Otto did a beer delivery test run, Apple applied for an autonomous car permit, Uber continued more rollouts, autonomous vehicle startups keep popping up everywhere, and on it goes. Industrial and shuttle uses of driverless vehicles will become prevalent more quickly than consumer uses, but autonomous vehicles in some fashion will be mainstream sooner rather than later. When self-driving cars become the norm, what will that do to all the parking lots in urban areas and office buildings? Gensler and other design firms are recognizing this, and new office building designs are starting to incorporate the eventual reclaiming of garages and parking spaces. What about commuting patterns? Will people become more accepting of longer commutes, and will this push up rents in the suburbs? Assisted living and multi-family communities will also be impacted as the elderly take advantage of this new freedom and communities rely less on car ownership. Industrial hubs will change as driving patterns shift with autonomous trucks. This is just the tip of the iceberg; a major real estate disruption is ahead.
  • Augmented Reality and Interactive Software – This isn't brand new to CRE, and it's becoming a mainstay in residential real estate, but the tools are still in their infancy. Floored was one of the first to demonstrate the value within CRE, but others are pushing these technologies into the local design process, enhancing collaboration in construction and space design. Some, like ECCO, are looking at interactive software to help find buildings in a community that appeal to selective clients. Augmented reality will help facility managers "see" manuals connected to their equipment, in addition to enabling remote, visual support. I also believe these technologies will, over time, transform how we all work and collaborate in our everyday jobs. This is the "Future of Work."
  • IoT – The Internet of Things (IoT) has been around even longer, but because of the fragmentation of the solutions previously available, a short-sighted ownership mindset and varying levels of user sophistication, there is still a lot of untapped potential. As the cost of sensors continues to drop, the implementation of industrial IoT will continue to accelerate. Security still needs to be addressed, but that will come with time. Today, energy monitoring and management are the biggest uses, but asset inventory, reactive maintenance alerts, better preventative maintenance schedules, enhanced employee experiences, and occupancy cost forecasting highlight just some of the other areas ripe for change.
  • Data Analytics – New startups are emerging that focus not only on faster data storage and retrieval, but also on making information actionable, meaningful, and usable in the hands of the business user. I see industry-specific alternatives cropping up that let you hit the ground running within industry domains; a pre-built understanding of "space" or "leases," for example, is a big jump over starting from scratch. It's universally agreed that data is now the new oil, an asset that is extremely valuable and highly worth investing in.
  • Blockchain – Here too there have been many developments over the last six months. In October, Cook County announced a pilot with Velox for property transfers. In February, the Republic of Georgia committed to using blockchain to validate property-related transactions. New consortiums have also been announced on the IoT/smart contract side of blockchain, a category that I personally think will become pervasive sooner than some of the other areas. It's still early, but blockchain is being tested and adopted in the financial industry, and there is a significant place for it in real estate. Think about the titling process or the end-to-end transaction process: there are huge inefficiencies built into today's model, and anything that removes barriers adds value to all parties. Unfortunately, some use cases do need broad traction before they can make great leaps forward, and all those legacy records need to be accounted for. Still, having seen firsthand how long the commercial buy/sell process takes, real estate needs a remedy offering greater speed and efficiency, and blockchain is a candidate. Note that the technology also carries an unfounded stigma, as many people still equate blockchain only with bitcoin. Bitcoin was just the first major use of the blockchain distributed ledger, and the scenarios spelled out here don't face all the same issues.

For those of us who have lived through previous CRE Tech cycles, it's understandable to be cautious. However, the nature of the technology in this investing cycle is much different. It's much easier and less capital intensive to start a company today: fully leveraging the cloud and the lower cost of capital that comes with it, startups can attack a problem more quickly and efficiently than in the past. That's exactly what VTS and Hightower did. The two poster children of this latest cycle were both able to quickly address a need that was screaming for help; companies realized they didn't need to rely on Excel, and the improved user interfaces and simple approaches were leaps and bounds ahead of the existing options. Add in the megatrend influences affecting all industries and the hundreds of other startups that have sprouted up on shoestring capital budgets, and you get a real ecosystem of quality companies addressing real needs today.

The real question is what will happen to these companies in 3-5 years. Will they survive a downturn in the economy? You can count on one hand the number of CRE startups that have gone public, so founders need to either be content staying private, merge with others (VTS/Hightower), or get bought by the big corporates (Floored) if they want to continue their growth trajectories. The IPO route is unlikely, so we'll more likely see a wave of consolidations as this growth cycle matures and founders look to either cash out or further scale their businesses. In either case, with an abundance of quality companies gaining attention, the advances in CRE tech are here to stay and we're all better off for it.

 

Data Quality and Ownership Are the Foundation of All BI Initiatives


How many times have you looked at a report or dashboard and quickly questioned the accuracy of what you're seeing? If your first reaction is that there's a problem with the software, you're looking in the wrong place. Most Business Intelligence (BI) initiatives fail not because the software isn't right for the job, but because people don't pay enough attention to data quality. To make information actionable for business improvement, the real goal of any BI initiative, you need a good data governance program in place. Only when you've got a good handle on the raw data can you turn your attention to how the information is presented. Taking it a step further, you can't even begin to think about advanced analytics and machine learning without top-quality data.

Most companies have a data quality problem before they even realize it. If you're in a consumer-facing business, how many applications do you have that contain the same information about your customer? Sales will have one set and your account record system another; add in your customer service application and your billing systems, and you can see how this gets out of control. Unless these are all in one application, you've got a data quality issue.

In the commercial and corporate real estate world where I spend my time these days, organizations have multiple systems that contain building information, tenant details, square footage, rent, lease information and headcount, just to name a few. If you're a rapidly growing company, your first priority is simply finding space for your workers and getting your product out the door. Customer service and time to market are your focus, not, say, the data mastering of your real estate systems.

No matter your business, you need to set up a good data governance program before embarking on a BI or data analysis initiative across your systems. In particular, you need to:

  1. Identify the data points that matter most to you. You shouldn't boil the ocean with every piece of data at the start, so determine the key pieces of information you care about the most. These can be drawn from most of the reports you look at today, or from the reports and dashboards you would like to see.
  2. Most likely, some of that information will live in more than one application, so determine which system will be your master. This is typically the system that houses the ongoing changes to your key data points. For a customer, it's likely your account management system; for a building, it's usually your space planning application, where CAD drawings and other up-to-the-minute changes reside.
  3. Determine who owns the data. The owner is the group that has business ownership of the data; in many cases, this is not whoever maintains the data, as that's often done as a service by other departments or organizations. This is one of the most important steps in improving your data: without a recognized and agreed-upon owner, no one steps up to ensure the data is accurate.
  4. Once you've got an owner, determine who is best placed to maintain the data. This "data steward" is responsible for keeping the information as accurate and current as possible, and works under the direction of the owner when conflicts or questions arise.
  5. Define your process for keeping the data clean and updated in a timely manner, along with the process for resolving conflicts when questions arise. Data governance is not a "create it and leave it" program; it needs constant nurturing as new data points are added, systems are included or replaced, and new business needs emerge.

With respect to machine learning, data quality is a fundamental necessity. If you feed an algorithm bad data, you'll get bad results and you won't even know it. Don't even go there unless you've got a good foundation with data governance at the core.

There are more tools available today that support a good data governance program. They can highlight inconsistencies, duplicates and anomalies that require attention, and they can be a valuable aid to your data stewards or analysts. Still, these tools should not be deployed unless you've got a good program set up, with top-down organizational buy-in.
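
To give a flavor of what these tools automate, here's a minimal sketch of basic quality checks on a hypothetical building master file using pandas; the file and column names are illustrative:

```python
# Flag duplicates, missing key fields, and out-of-range values for a data steward.
import pandas as pd

buildings = pd.read_csv("building_master.csv")  # hypothetical extract

dupes = buildings[buildings.duplicated(subset=["building_id"], keep=False)]
missing = buildings[buildings[["address", "rentable_sq_ft"]].isna().any(axis=1)]
suspect = buildings[(buildings["rentable_sq_ft"] <= 0) |
                    (buildings["rentable_sq_ft"] > 5_000_000)]

print(f"{len(dupes)} duplicate building IDs")
print(f"{len(missing)} rows missing address or square footage")
print(f"{len(suspect)} rows with implausible square footage")
```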

Data governance is not a sexy concept, and many organizations don't take the extra time and effort to set it up. If you're serious about your Business Intelligence initiative, don't be that lazy organization. Take the time and effort upfront, and you'll end up with a more successful BI program in the end.
