The problem with Excel-based monitoring

Success is based on our ability to learn and adapt. Yet our monitoring systems weren't designed for this.

Mark Winters, Co-founder and CEO

If there’s one thing we’ve learned, it’s that sticking with predetermined plans doesn’t work – success depends upon our ability to learn and adapt. Our monitoring systems should add significant value here. And yet, in practice, most do not. Why?

Part of the answer lies in the tools we use. Most monitoring systems are built in Excel, so I’ll focus there.

Let me first express some love for my erstwhile companion of 18 years. Most people know how to use it, so it’s easy to get started – familiarity reduces the learning curve. It’s flexible, so you can customise your system endlessly. It’s also cost-effective, as most teams will already have access. If you have a few simple projects and a limited budget, Excel might be a good option.

With that said, there are big drawbacks to an Excel-based monitoring system. Here are my top four:

1. No quality control on data input

An effective monitoring system helps us capture reliable and consistent data, which is essential for identifying trends and drawing valid conclusions. The problem with Excel is that you can write anything into a cell – and people do. Indicator units evolve over time. You’re expecting a number but get a little narrative instead. Disaggregated data doesn’t add up, or data lacks a specified timeframe. Excel’s flexibility is also its weakness: the lack of ‘guardrails’ compromises data quality.

2. Woeful with words

A good monitoring system should handle both quantitative and qualitative data. While numbers are important, understanding the ‘what’, ‘how’, and ‘why’ of change requires qualitative data. And if we’re interested in the sustainability of the changes we promote, we have to understand how people feel about them. Excel is good with numbers but struggles with text. There are only so many cells I’m willing to merge. Many teams end up fragmenting their data across various Excel, Word and PowerPoint docs – a problem that grows over time.

3. Doesn’t present data for decision-making

To support learning and adaptation, monitoring systems must present data in ways that make it easy to understand project progress. Ideally, they should organise and visualise data against Theories of Change (ToCs). This helps clarify the ‘what’s so’ and enables discussion of the ‘so what’. This is a crucial function for any monitoring system, but for most, it’s absent. The reality is multiple spreadsheets, each containing several tabs. Data is presented as a wall of numbers. Field notes are somewhere, survey data somewhere else. In general, Excel systems do a bad job of bringing data together in ways that make it easy to understand progress.

4. Drag on adaptation

A good monitoring system should be the backbone of project adaptation, evolving alongside projects and portfolios. When new activities arise, or we need to track an outcome more closely, our monitoring should respond. It’s a sad admission but I’ve spent weeks (perhaps months!?) adjusting ToCs and monitoring plans in Excel. Adding, deleting, merging, and remerging rows. And unless you’re diligent, things get messy. And sometimes they break. Version control complicates the process. Instead of facilitating adaptation, Excel becomes a drag upon it.

***

I'm feeling like a fun-sponge so let me finish on a positive note.

We can develop better monitoring systems. Systems that help ensure data quality; that are good at handling quant and qual data; that support decision-making; and that serve as the backbone of project adaptation. Systems that support learning and adaptation and, ultimately, help us maximise our impact.

At Paths, this is our mission, and I'll discuss our approach in subsequent blogs.