Earlier this week, I read a short article in CIO Dive centered on the idea that a lack of data is the basic challenge in ESG and carbon disclosures. Based on a few decades of auditing E&S conditions, data, processes and systems, I agree with the article to an extent. However, the focus shouldn't be on the volume or sufficiency of data; it should be on appropriateness, relevance and validity. IT systems can be a huge help in collecting and managing data, but very often they can't (or don't) discriminate between bad, good or incomplete data.
Where Bad ESG Data Comes From
Developing and implementing good E&S data controls means first understanding data sources and paths to potential errors. In my experience, bad E&S data comes from a variety of situations including:
- Human error in data entry. People make typographical errors, accidentally transpose numbers or misread handwritten logs. Automated checks and balances can be programmed into IT systems to flag entries that fall outside an expected range. That is certainly helpful, but there are also times when those reference parameters need to be changed, which requires vigilance, as well as someone remembering to make the change when necessary.
- Not understanding the data request. Even with IT, humans still need to interact with ESG data at times, frequently at its point of origination. Without at least a basic understanding of what is being asked and what correct data should look like, errors are likely. In the realm of conflict minerals, I frequently encountered questions answered by administrative or temp staff who had no idea about product content. I remember a receptionist being given the task of answering whether or not the company funded armed groups in conflict-affected and high-risk areas, a very complicated question. This is perhaps a greater risk in the social realm than the environmental.
- Equipment failures. Monitoring devices, such as air emissions monitors and wastewater flow meters, fail in various ways, producing false and erroneous readings that aren't always obvious. Some equipment failures cause the monitoring equipment to report artificially low or high numbers, a "0", or a text field (such as "ERROR") that automated calculations simply ignore because the data isn't numerical. All of these skew the resulting calculations, especially averages. Some measurement devices (such as pH meters) require daily calibration; without it, readings can become increasingly erroneous over time.
- Modifications to equipment without corresponding changes to data collection/processing parameters. Air emissions calculations are closely tied to production levels, which in turn can be greatly impacted by equipment upgrades like motor size and conveyor sprockets. Changes like these don't intuitively seem to affect production rates, but they can. Those were among the maintenance changes that took place in plywood mills at a company I once worked for, and they contributed to the $120 million settlement between the company, EPA and the Department of Justice. Maintenance systems log the physical equipment changes made, but the resulting operational upgrade may not be reflected in operational calculations.
- Errors in calculations/formulas. Even if E&S data is good at its origin, if the calculation is erroneous, the output will be too. Checking air emissions calculations can be tedious (trust me, I've been there), but it is an important step. With all the new IT systems getting into GHGs, I don't know how many have had their assumptions, formulas and outputs verified by third-party technical emissions experts.
- Changes in data collection forms/formats without making changes to the reporting output forms/formats. Nothing is static, especially these days in the ESG data and reporting world. As new standards, frameworks and assumptions are published, how E&S data is collected and reported has to keep up – and continue to match up. A change in one component of the process has to filter through the rest of the data collection, management and reporting processes.
- Wrong, bad or incomplete data source(s). If the source of your E&S data isn’t complete, credible or validated, it probably isn’t worthy of reliance. Remember the old saying “garbage in, garbage out.”
- Fraud and failures in controls. With the new importance of E&S, and its inclusion in some executive compensation metrics, comes increased motivation to make numbers or situations look better, or to show that something is being done when it really isn't (or vice versa). This can foster fraud or the bypassing of E&S data controls.
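Two of the failure modes above, entries outside an expected range and non-numeric sensor outputs that get silently dropped from averages, can be sketched in a few lines of code. This is a minimal illustration only; the range, function name and sample readings are all assumed for demonstration and don't come from any real monitoring system.

```python
# Illustrative sketch: separating usable sensor readings from entries
# that need human review. The pH window below is an assumed example.
EXPECTED_RANGE = (6.0, 9.0)

def validate_readings(raw):
    """Split raw readings into usable numeric values and flagged entries."""
    valid, flagged = [], []
    for r in raw:
        try:
            value = float(r)
        except (TypeError, ValueError):
            flagged.append((r, "non-numeric"))   # e.g. an "ERROR" text field
            continue
        if not (EXPECTED_RANGE[0] <= value <= EXPECTED_RANGE[1]):
            flagged.append((r, "out of range"))  # sensor fault or real excursion
            continue
        valid.append(value)
    return valid, flagged

# A failed probe reports "ERROR" once and a bogus 0 once.
readings = [7.1, 7.3, "ERROR", 0, 7.2]

# A naive pipeline that keeps whatever happens to be numeric reports a
# clean-looking average, but the bogus 0 drags it well below reality.
numeric = [v for v in readings if isinstance(v, (int, float))]
naive_avg = sum(numeric) / len(numeric)

valid, flagged = validate_readings(readings)
print(f"naive average: {naive_avg:.2f}")
print(f"validated average: {sum(valid) / len(valid):.2f}")
print(f"flagged for review: {flagged}")
```

The point of the sketch is the gap between the two averages: the naive calculation quietly absorbs the failed-sensor "0", while the validated path excludes it and surfaces both bad entries for a human to investigate, which is exactly the kind of control an IT system won't apply unless someone builds and maintains it.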
Minimizing Bad ESG Data
I’ve written before about E&S data problems and validation; none of this is new. An emphasis on IT solutions can create a potentially false sense of security that data is error-free. Because ESG data can go wrong in so many ways without automated systems knowing or flagging it, companies must maintain vigilance and controls. Companies should include ESG, operational and process subject matter experts in their internal audit and other data controls/validation processes. It is also prudent to dig into third-party IT systems to learn how they manage data issues, and to double-check the calculations and assumptions embedded in them. Understanding operational contexts, their linkages to data, where it originates and how it is collected is fundamental to having confidence in the data and the resulting output.