Data Strategy Fundamentals: Data Collection and Validation 

By Tyler Christensen, Head of Industry Solutions at Cherre

Last night, the aroma of eggs in purgatory filled our home. It’s warm, comforting, and a little spicy – a family favorite. As we sat around the table to eat, it struck me how similar crafting a robust data strategy is to making dinner.

Cooking eggs in purgatory requires key ingredients: eggs, tomato, onions, garlic, salt, and pepper. But what if dinnertime comes and I’m missing something critical? Perhaps I overlooked buying tomatoes, assuming there was a can in our pantry. If so, I’d need to make something else – or make nothing at all and go hungry.

In much the same way, incomplete data leaves you hungry for insights. Even worse: If I used rotten eggs because I didn't take the time to check them for cracks, my recipe wouldn't just turn out badly – it might make my family sick.

Just as rotten eggs ruin a recipe, rotten data sickens downstream processes like underwriting, reporting, financial modeling, dashboards, and AI models. 

No matter how advanced your language model, how elegant your reporting, or how dazzling your dashboards, your efforts will fall flat without a way to efficiently bring trusted, consolidated data into your system.

For your enterprise to make decisions with confidence, you need high-quality data. Cherre defines data quality with three C’s: Consistent, Complete, and Correct.

  • Consistent: Is your data populated the same way every time? (Perhaps a company’s parent name is sometimes listed as the tenant name, whereas other times a specific brand is listed.) Does the data mean the same thing to everyone who uses it? When data is entered, are the fields used the same way by everyone?
  • Complete: Does your data come with all the fields you need? Or are fields populated only in certain cases? For example, have your providers submitted data that’s missing required account codes?
  • Correct: Is the data accurate? Or does it conflict with other sources at your disposal, or even with what you know to be true? Consider a scenario where a set of comp data you acquired suggests that the average rent per square foot in a specific submarket is significantly higher than figures reported by alternative comp sources and your own internal records.
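The three C’s can be expressed as simple record-level checks. The sketch below is illustrative only – the field names (tenant_name, account_code, rent_psf), the canonical tenant list, and the benchmark tolerance are assumptions for the example, not Cherre’s actual schema or rules:

```python
# A minimal sketch of the three C's as record-level checks.
# Field names and thresholds here are illustrative assumptions.

REQUIRED_FIELDS = {"tenant_name", "account_code", "rent_psf"}

def check_record(record, known_tenants, benchmark_psf, tolerance=0.5):
    """Return a list of data-quality issues found in one record."""
    issues = []

    # Complete: every required field is present and non-empty.
    for field in sorted(REQUIRED_FIELDS):
        if not record.get(field):
            issues.append(f"incomplete: missing {field}")

    # Consistent: the tenant name matches a canonical list, so the
    # same company isn't entered under multiple spellings or brands.
    tenant = record.get("tenant_name")
    if tenant and tenant not in known_tenants:
        issues.append(f"inconsistent: unknown tenant name {tenant!r}")

    # Correct: rent per square foot falls within a plausible band
    # around a benchmark figure drawn from other sources.
    psf = record.get("rent_psf")
    if psf is not None and abs(psf - benchmark_psf) > tolerance * benchmark_psf:
        issues.append(f"suspect: rent_psf {psf} far from benchmark {benchmark_psf}")

    return issues
```

A clean record returns an empty list; a record that fails any of the three C’s returns a human-readable issue for each failure, which is the kind of feedback a provider can act on.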

Your ability and approach to managing the quality of your data depend on its origin:

  • First-party data: Internal data created by your organization.
  • Second-party data: Data that is yours but not created by you. This data is usually created by partners such as your JV partners or property managers.
  • Third-party data: Data sourced from other providers, such as market or comp data. This may be data you subscribe to or data in the public domain.

Cracking data quality issues in first- or third-party data is often fairly straightforward. For first-party data, you can enhance your processes, retrain your employees, or update your system’s required fields. Low-quality third-party data can be swapped out for a new source entirely.

But second-party data is more challenging.

Your providers use tools, processes, and systems that are different from yours and also different from your other providers, which makes it challenging to validate and aggregate data to your standards. This typically results in an error-prone, manual process of data mapping and back-and-forth with providers to resolve issues.

Although you typically can’t require your providers to use the same systems or processes as yours (or each other), you can enforce standards upon the data they submit to ensure that it is consistent, complete, and correct.

That’s where a tool like Cherre’s Data Submission Portal comes in.

Cherre’s Data Submission Portal automatically validates, maps, and aggregates partner-submitted data against your business rules, providing complete visibility into your submission cycle from submittal to approval.

Providers receive instant feedback as soon as they upload their data, allowing them to identify and fix their own errors. The portal automatically checks for data consistency, completeness, and correctness, identifying common problems like new GL accounts, chart of account mapping changes, and prior period entries.
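One of those checks – flagging GL accounts that don’t appear in an approved chart of accounts – can be sketched in a few lines. This is a hypothetical illustration of the idea, not the portal’s implementation; the row shape and the `gl_account` field name are assumptions:

```python
# Hypothetical sketch: flag GL account codes in a provider submission
# that do not appear in the approved chart of accounts, so the
# provider can correct them before the file is accepted.

def flag_new_gl_accounts(submission_rows, chart_of_accounts):
    """Return the set of submitted GL account codes not in the chart."""
    submitted = {row["gl_account"] for row in submission_rows}
    return submitted - set(chart_of_accounts)
```

Surfacing this set back to the provider at upload time is what turns a multi-week email exchange into an immediate, self-service fix.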

Once you’ve mastered the submissions process with processes designed to crack those rotten data eggs, you can turn your attention to what matters: Driving trusted insights to the decision-makers who drive your business. (And, perhaps, trying a new recipe.)

Cherre’s Tyler Christensen is a real estate technology strategist with roots in finance, asset management, and portfolio management. As our Head of Industry Solutions, he leverages his data and domain expertise to help our clients future-proof their data ecosystems and unlock value from their data.

Get Clean, Accurate, Consolidated Data Faster Than Ever

End the endless back and forth with your providers. Accelerate your enterprise with trusted, accurate data through Cherre’s Data Submission Portal – a fully configurable tool to instantly validate and consolidate partner data with unprecedented flexibility and transparency.

Get a Demo