One of the most basic premises of Data Integrity is ensuring that the data in your project cost management system is Current, Accurate and Complete. We talk a lot about getting your project data in real time, and how that leads to better decision making and the ability to take early, corrective action on issues as they arise. We also go to great lengths to discuss how project cost management software can be configured to ensure the highest levels of accuracy at data input by preconfiguring constraints, tolerances and ranges of acceptable values.
In this installment on data integrity, I want to talk a bit about what happens after that data has been input into the system. When it comes to project controls on construction projects, it’s critical for the project software to be on duty 24 hours a day, 7 days a week. Your teams will lean on the project cost management software to work tirelessly around the clock to monitor the data and perform continuous integrity checks, reconciliations, and benchmarking.
Data Integrity Checks
After any project data has been input – whether in the field as daily jobsite data, or in the office as budgets, documents, change orders, forecasts, etc. – automated data integrity checks run day and night to catch any issues that may have slipped past the initial data validation on input, or that may have emerged through background synchronizations with other systems. For example, if your project cost management software is integrated with other enterprise systems and synchronized on a regular basis, external data entering the system has the capacity to modify, overwrite, delete or otherwise corrupt data in the system. Data checks are of course performed during the sync process, but errors and anomalies are always possible.
In addition to automatic synchronizations, data can also be manipulated in the background through user-driven imports or other background processes. The key is that these integrity tests run continuously in the background to ensure persistent data integrity over the long term.
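To make the idea of a recurring background integrity check concrete, here is a minimal sketch of one kind of reconciliation such a system might run: verifying that a change order's stated total still matches the sum of its line items after a sync or import. The record structure, field names and tolerance are illustrative assumptions, not actual 4castplus internals.

```python
# Hypothetical background reconciliation check (illustrative only):
# flag any change order whose stated total has drifted from the sum
# of its line items, e.g. after a sync overwrote one of the values.

from dataclasses import dataclass

@dataclass
class ChangeOrder:
    order_id: str
    stated_total: float
    line_items: list  # individual line-item amounts

def find_total_mismatches(orders, tolerance=0.01):
    """Return IDs of orders whose stated total drifts beyond the tolerance."""
    flagged = []
    for order in orders:
        if abs(order.stated_total - sum(order.line_items)) > tolerance:
            flagged.append(order.order_id)
    return flagged

orders = [
    ChangeOrder("CO-101", 1500.00, [500.00, 1000.00]),  # consistent
    ChangeOrder("CO-102", 1200.00, [500.00, 650.00]),   # drifted after a sync
]
print(find_total_mismatches(orders))  # flags CO-102
```

A real system would run a battery of checks like this on a schedule and route each flagged record into a correction or notification workflow, rather than printing to the console.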
Data Benchmarks on Big Data
Part of that background activity is to run algorithms that aggregate the data into meaningful benchmarks, averages, trends and the like – in other words, Big Data. This aggregate information is not only valuable for users in its own right – for example, when analyzing the data for broader trends – it can also be used to spot data anomalies and outliers that can be flagged as potential errors.
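One simple way this kind of benchmark-based flagging can work is a statistical outlier test: compare each incoming value against the aggregate and flag anything far from the norm. The sketch below uses a basic z-score test; the data, function names and threshold are assumptions for illustration, not a description of how 4castplus implements it.

```python
# Illustrative outlier flagging against an aggregate benchmark.
# Values far from the mean (in standard deviations) are flagged as
# potential data-entry errors for further investigation.

import statistics

def flag_outliers(values, threshold=2.0):
    """Return values more than `threshold` std deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # no spread, nothing to flag
    return [v for v in values if abs(v - mean) / stdev > threshold]

# Daily unit costs for an activity; one entry looks like a misplaced decimal.
unit_costs = [42.0, 44.5, 41.8, 43.2, 440.0, 42.9, 43.7]
print(flag_outliers(unit_costs))  # flags 440.0
```

In practice a production system would benchmark against historical data across many projects rather than a single small sample, and would use more robust statistics (medians, interquartile ranges) that are less distorted by the outliers being hunted.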
Your Project Software Working for You
The 4castplus project cost management system never stops checking and re-checking all data that’s been input into the software. It uses numerous intelligent algorithms to identify probable data integrity issues and either corrects them on the spot (where applicable) or issues notifications so they can be investigated further.