On Sunday, Public Health England admitted that COVID-19 infections had recently been under-reported to the tune of 16,000 cases. The cause? A botched Excel import.
As Nick Felton, Senior Vice President at MHR Analytics, highlights, this very public failure holds vital lessons for all organisations. Not least that, in 2020, “primitive, low-cost technology such as spreadsheets” has no place in critical and often complex data operations.
At first, Public Health England stated that, due to a “technical issue”, the number of COVID-19 infections had been under-reported by 15,841 cases. These missing positive tests were conducted between 25 September and 2 October. The absent figures were added to the national statistics on Sunday evening.
The reporting failure meant that in the run-up to the correction, official figures painted a misleading picture. It was clear that infection numbers were climbing; but the rate of escalation had been significantly under-represented. The delay in feeding this information through to NHS Test and Trace led to a subsequent delay in efforts to notify the contacts of those who had tested positive for the virus, in some cases by around a week.
Subsequent reports from the BBC shed further light on what had happened. It seems PHE have an automated process in place to pull nationwide infection data into Excel templates. This data is then uploaded to a central system where it is made available to Test and Trace and other government departments.
For this process, developers had picked an old file format: XLS. Each template could therefore hold only about 65,000 rows of data, and since each test result spans several rows, each template was limited to around 1,400 cases. Once that limit was reached, any further cases were simply left off.
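The arithmetic behind the failure is straightforward. The sketch below simulates the silent truncation described above; the exact number of rows per test result has not been published, so the 47-rows-per-case figure here is purely an assumed value for illustration.

```python
# Legacy .xls worksheets are capped at 65,536 rows (2^16).
XLS_MAX_ROWS = 65_536
ROWS_PER_CASE = 47  # assumed for illustration; each result spans "several rows"

def import_cases(case_ids, rows_per_case=ROWS_PER_CASE, max_rows=XLS_MAX_ROWS):
    """Simulate an XLS import: cases past the row cap are silently dropped."""
    capacity = max_rows // rows_per_case
    imported = case_ids[:capacity]
    dropped = case_ids[capacity:]  # lost with no warning or error raised
    return imported, dropped

# 2,000 cases submitted, but only ~1,400 fit in a single template
imported, dropped = import_cases(list(range(2_000)))
```

The key point is the last line of `import_cases`: everything beyond the cap lands in `dropped`, and nothing in the format itself signals that data was lost.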
Nick Felton points to a fundamental failure to utilise suitable data processing tools in this case. “The fact that a primitive, low-cost technology, such as spreadsheets, is being used to track something as vital as Covid-19 cases is quite staggering when up-to-date sophisticated data analytics is widely available”.
In both the public and corporate sectors, the limitations of spreadsheets are all too well known. As Nick puts it:
“Many Excel users are familiar with this scenario: you enter your data into a shared spreadsheet; you try to save it and receive a notification that the workbook is currently in use. You decide between closing out and losing all your input or saving a second version of the file and promising yourself to go back and merge the data later. That never happens, and your team ends up with multiple copies of a spreadsheet, each one carrying a portion of the truth. This should never be happening with such critical information that has such an impact on public confidence”.
Lessons for organisations
The absence of any real error control, extremely limited automation and problematic scalability: these longstanding issues should give organisations cause for concern if they are continuing to rely on Excel for data storage and processing.
Martin Rogers, Head of BI Solutions at MHR Analytics sets out the basic requirements for a fit-for-purpose solution:
“Under no circumstances should an Excel spreadsheet be used as an alternative to a database/storage option. Databases provide central security, indexing, backup & restore functionality, can scale indefinitely based on hardware, and finally, are less prone to corruption”.
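To illustrate the difference, even a lightweight embedded database such as SQLite enforces a schema and rejects bad data rather than silently dropping it, and holds row counts far beyond any legacy spreadsheet cap. A minimal sketch, with table and column names invented for illustration:

```python
import sqlite3

# In-memory database for the sketch; a real deployment would use a managed server
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE test_results (
        case_id     INTEGER PRIMARY KEY,  -- duplicate case IDs are rejected
        specimen_dt TEXT NOT NULL,        -- missing dates are rejected
        result      TEXT NOT NULL CHECK (result IN ('positive', 'negative'))
    )
""")

# Insert more rows than a legacy .xls worksheet could ever hold: nothing is dropped
rows = [(i, "2020-09-25", "positive") for i in range(100_000)]
conn.executemany("INSERT INTO test_results VALUES (?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM test_results").fetchone()[0]
```

Unlike the XLS import, a constraint violation here raises an error immediately instead of producing a quietly incomplete dataset.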
In 2020, missing rows, botched exports and an incomplete picture of performance have no place in any organisation. To develop a roadmap for finally moving away from your legacy systems, speak to MHR Analytics today.
All media enquiries should be directed to email@example.com