The real reason most data projects take too long to deliver value

Data projects succeed when teams focus on speed over scale, anchor work in clear business value, and build pragmatically by reusing what already works to deliver impact faster

Why data projects often stall without delivering business value

Every organization wants to become data-driven. And yet, the same story plays out repeatedly: a data warehouse gets built, dashboards get delivered, and months later, very few people actually use them. The project quietly becomes a forgotten tool. So what went wrong?

The answer usually isn’t the technology, but everything that happened, or didn’t happen, before the first line of code was written. At Cuesta we repeatedly see five interrelated issues that stall data projects.

1. Starting too big

The most common upstream failure is defaulting to maximalist thinking and trying to do it all in one large data project: integrating all data sources, designing the entire canonical model, gathering requirements from everyone, and delivering a complete dashboard with alerts and multi-channel distribution. This approach feels comprehensive, since teams worry they'll miss something or have to rebuild later, but it delays the path to value. By optimizing for completeness over impact, projects push value realization so far into the future that momentum dies before anything ships.

Fix: Identify the single business question that matters most right now and build only what’s needed to answer it. Expand from there once it’s working.

2. Starting with data, not value

Most data projects fail to anchor themselves to a clear business outcome from day one. There are three main value drivers a data initiative can impact: increasing revenue, decreasing cost, or improving enterprise valuation (often via risk reduction). If your team can't explain which of those levers they're pulling, the dashboards and datasets you're building risk becoming solutions in search of a problem. Even once the final product is built, executives will not insist that their teams use your analytics if they do not understand how it helps them. A clear, well-communicated business value hypothesis is essential to success.

Fix: Before writing a single line of code, document a one-sentence value hypothesis that names a specific lever (revenue, cost, or risk) and how the initiative moves it.

3. Treating product management as optional

Without clear business value, under-investing in product management is often the next issue to arise. Assigning a part-time “product owner” role inside a scrum team is often insufficient. A product manager on a data initiative needs to be accountable for: understanding the problem (user and market insight), defining the solution (prioritization and specification), enabling delivery by working with the engineering team, and closing the loop (did it work, what did we learn, what’s next?). This ensures the value proposition is carried from the executive level down to the users.

Dedicated product managers also need to understand the technology well enough to have honest prioritization conversations and to say “yes” when a stakeholder requests a new feature. Referencing the prioritized backlog, PMs must be able to keep the client on a fixed timeframe to deliver value and explain that adding a new feature may mean shifting something else to a future sprint. This requires understanding the big picture to keep the project on course, as well as the small details at the column level to ensure user acceptance. If product owners don’t know the details, they will lose the trust of the users who do.

Fix: Assign a dedicated product manager who owns the triangle of strategy, user engagement, and change management for the full duration of the project.

4. Not triaging data quality pragmatically

One key question to ask when assessing data quality is: which problems are actually causing pain, and what is it worth to fix them? For example, 80% clean data that covers 95% of revenue might be completely sufficient. Chasing the remaining 20% past the point of diminishing returns is often more of a budget decision than a technical one. When a quality problem is worth fixing, first determine whether it is a data problem or a process problem. Some issues can be resolved on the back end through fuzzy matching or deduplication (see the sketch below), while others require more serious process changes. The goal isn’t perfect data; it’s data that’s good enough to drive the decisions that realize value. The challenge is knowing where that line is.
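Where fuzzy matching or deduplication is the right back-end fix, it doesn’t have to be elaborate. The sketch below is a minimal illustration using only Python’s standard library; the sample names, the greedy clustering approach, and the similarity threshold are assumptions you would tune against a manually reviewed sample, not a production recipe.

```python
# Minimal sketch of back-end deduplication via fuzzy matching.
# Sample data and threshold are illustrative assumptions only.
from difflib import SequenceMatcher

customers = [
    "Acme Corp", "ACME Corporation", "Globex LLC", "Globex, L.L.C.", "Initech",
]

def similarity(a: str, b: str) -> float:
    """Rough string similarity in [0, 1]; good enough for triage, not a product."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

THRESHOLD = 0.7  # tune against a manually reviewed sample before trusting it

# Greedy clustering: each record joins the first existing cluster it matches.
clusters = []  # list of lists of likely-duplicate names
for name in customers:
    for cluster in clusters:
        if similarity(name, cluster[0]) >= THRESHOLD:
            cluster.append(name)
            break
    else:
        clusters.append([name])

for cluster in clusters:
    print(cluster)
# Expected grouping: the two Acme variants together, the two Globex variants
# together, and Initech on its own.
```

Even a rough pass like this can show whether duplicates are concentrated in the records that matter (say, the accounts driving most of the revenue) before anyone commits budget to a full cleansing effort.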

Fix: Conduct a cost-benefit evaluation to triage data quality issues, prioritizing the ones with the biggest impact on the business value lever.

5. Rebuilding what already exists

Before building new master data sets or cross-reference files, look for what already exists inside the organization. A usable master data set is often already maintained by someone in the firm: a timesheet system probably already maps employees to projects, and HR already has org hierarchies. A simple Excel mapping file in SharePoint, maintained manually, is a perfectly valid starting point until you mature into needing cross-references in a formal system.
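As a concrete illustration of reuse over rebuild, the sketch below joins an existing, manually maintained employee-to-project mapping onto timesheet data instead of mastering that relationship from scratch. The file names, sheet name, and column names are hypothetical, assumed purely for the example.

```python
# Hypothetical sketch: reuse an existing mapping file as a cross-reference
# instead of building a new master data set. Paths, sheet name, and column
# names are assumptions for illustration only.
import pandas as pd

# e.g. an export of the Excel file the ops team already keeps in SharePoint
xref = pd.read_excel("employee_project_mapping.xlsx", sheet_name="Mapping")
xref = xref.rename(columns={"Employee ID": "employee_id",
                            "Project Code": "project_code"})

# Join the existing mapping onto timesheet facts rather than re-mastering it
timesheets = pd.read_csv("timesheets.csv")  # assumed extract with an employee_id column
enriched = timesheets.merge(xref[["employee_id", "project_code"]],
                            on="employee_id", how="left")

# Rows that fail to match show where the manual mapping needs attention
unmatched = enriched[enriched["project_code"].isna()]
print(f"{len(unmatched)} timesheet rows lack a project mapping")
```

The point is not the code itself but the sequence: start from the asset someone already maintains, measure where it falls short, and only then decide whether a formal cross-reference system is worth building.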

Fix: Run a “what already exists?” audit before any net-new build. Assign someone to interview ops, HR, finance, and IT specifically to surface unofficial but functional data assets.

In short, the organizations that deliver data value fastest aren’t always the ones with the most sophisticated infrastructure. They’re the ones that start small, stay grounded in business outcomes, invest in product management, approach data quality pragmatically, and resist the urge to build what they can simply reuse.
