How to Prove ROI Before Scaling AI
Most data governance business cases rely on vague “risk mitigation” arguments and hope for the best. But Michael Weiss, AVP of Product Management at Nasdaq, had a different approach.
“Instinctively we knew we had challenges based on feedback and word of mouth,” he explained at Atlan Re:Govern. “But that doesn’t build a foundation for a solid business or investment case.”
So his team ran a survey of primary stakeholders. What they discovered changed everything: 80% of users were spending 5+ hours per week trying to discover assets. Not analyzing. Not building. Just finding what exists.
That alone unlocked Nasdaq’s entire governance investment.
Like Michael, leaders from Invitation Homes, EasyJet, CME Group, and General Motors took to the stage at Re:Govern to share how they’ve cracked one of the hardest challenges in data: selling the invisible work. Not with platitudes about data quality or generic ROI projections, but with hard numbers, measurable outcomes, and business cases that boards actually understand.
Here’s the blueprint for proving ROI before asking for budget – and why it matters more than ever in the age of AI.
Why the traditional business case fails
“Governance is important” is an unconvincing pitch to secure funding. And yet, most governance business cases still lead with compliance risk and regulatory requirements.
The problem is that compliance is table stakes. It doesn’t differentiate or demonstrate value creation. And in the AI era, when every CEO wants results now, “we need to avoid fines” doesn’t compete with “we can deploy AI that drives revenue.”
Chris Durham, Director of Enterprise Data Products at Invitation Homes, recalls how, before proper governance was in place, five analysts would produce five different answers to the same question. “We had doubts,” Chris said. “Is this created the right way?”
That’s a business problem, not a compliance problem. And business problems require business metrics.
The 3 ROI metrics that matter for data and AI
Metric 1: Time to discovery
What Nasdaq measured: Before investing in governance, Nasdaq surveyed their primary stakeholders and found that 80% of users were spending 5+ hours per week trying to discover what data assets existed.
“Time equals money,” Michael explained. “You’re taking high-value resources and having them just figure out what they have available to start doing the job they’re trying to do.”
How to measure it:
- Survey stakeholders on the current state: “How long does it take you to find the data you need?”
- Quantify time waste: Hours per week × hourly rate × number of affected users
- Set a measurable target: “We want to reduce discovery time from five hours to 30 minutes”
- Monitor post-implementation: Track actual time to discovery in your governance platform
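To make that measurement concrete, here’s a minimal Python sketch of the survey analysis – the responses below are illustrative, not Nasdaq’s actual data:

```python
from statistics import median

# Hours per week each respondent reports spending on data discovery
# (hypothetical survey responses)
responses = [6, 5, 7, 2, 5, 8, 5, 6, 1, 5]

threshold = 5  # the "5+ hours per week" cutoff
share = sum(h >= threshold for h in responses) / len(responses)

print(f"{share:.0%} of users spend {threshold}+ hours/week on discovery")
print(f"Median discovery time: {median(responses)} hours/week")
```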
The result: Users now find assets in minutes, not hours. The business case was justified on this metric alone.
Why this matters: In regulated industries like financial services and healthcare, high-value analysts and data scientists shouldn’t be excavators. When 80% of your team spends hours each week hunting for data, it doesn’t just waste money – it delays AI implementation.
Metric 2: Time to resolution
What Nasdaq measured: Breakages are inevitable. It’s how they’re resolved that counts – and for many data teams, that’s not always a pretty picture. For Nasdaq, the average resolution time pre-governance was eight to ten hours.
“Unfortunately, despite all our best efforts, data doesn’t behave in the wild. Things do break,” said Michael. “And one of the big things we found is that we weren’t resolving issues as quickly as we’d like.”
How to measure it:
- Track current mean time to resolution (MTTR) for data issues
- Identify why resolution is slow:
  - Can’t find ownership?
  - Don’t understand lineage?
  - Can’t assess impact?
- Implement governance capabilities
- Track actual MTTR reduction post-implementation
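Here’s a minimal sketch of the MTTR calculation itself, using hypothetical incident timestamps:

```python
from datetime import datetime

# Hypothetical incident log: (opened, resolved) timestamps
incidents = [
    ("2024-03-01 09:00", "2024-03-01 18:30"),
    ("2024-03-04 11:15", "2024-03-04 20:00"),
    ("2024-03-07 08:00", "2024-03-07 16:45"),
]

FMT = "%Y-%m-%d %H:%M"
hours = [
    (datetime.strptime(done, FMT) - datetime.strptime(opened, FMT)).total_seconds() / 3600
    for opened, done in incidents
]

print(f"MTTR: {sum(hours) / len(hours):.1f} hours across {len(incidents)} incidents")
```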
The result: After improving data lineage visibility, assigning clear ownership, and proactively detecting issues before they reached production, Nasdaq’s average resolution time dropped to two to three hours – a 70 to 80% reduction.
Why this matters:
In financial markets, minutes of downtime can mean millions in potential fines. In customer-facing scenarios, bad data equals lost revenue. As Michael noted: “Presenting misinformation or bad information to end users is not ideal.”
But in the AI era, where agents operate autonomously, resolution time isn’t just about fixing broken dashboards. It’s about preventing AI from making expensive decisions based on flawed context. Treating resolution as a proactive practice rather than a reactive task makes it a business imperative, not something that belongs solely in the data team’s domain.
Metric 3: Adoption and self-sufficiency
What Invitation Homes measured: Before implementing a governance program, Invitation Homes had 45 separate data assets serving similar purposes. Chris’s team needed a way of knowing with confidence:
- Adoption rates: Are people actually using this?
- Convergence: Are we consolidating fragmented assets?
How to measure it:
- Quantify the number of active users: Not just licensed users
- Track frequency of use: Repeat engagement, not one-time logins
- Calculate the self-service rate: Percent of questions answered without data team intervention
- Determine asset utilization: Which data products are actually used vs. sitting idle
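A minimal sketch of how those adoption metrics might be derived from platform usage counts (all inputs hypothetical):

```python
# Hypothetical monthly counts pulled from platform usage logs
licensed_users = 500
active_users = 320      # users with at least one session this month
repeat_users = 240      # users active in two or more weeks of the month

questions_asked = 400   # data questions raised across the org
self_served = 310       # answered without data team intervention

print(f"Active rate:       {active_users / licensed_users:.0%} of licenses")
print(f"Repeat engagement: {repeat_users / active_users:.0%} of active users")
print(f"Self-service rate: {self_served / questions_asked:.0%}")
```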
The result: After implementing a disciplined data products approach with proper governance, Invitation Homes is consolidating its disparate assets, using scorecards to track progress toward convergence targets.
“That’s going to relieve the burden off of our data engineering and simplify consumption from a reporting Tableau environment,” said Chris. “Now, the prioritization of data products is really driven by some of the leadership committees that we have, so we go in with a business case.”
Shivaprasad (Shiv) Nayak, Head of Enterprise Data, Analytics and AI Architecture at EasyJet, built on Chris’s guidance, advocating for starting small but strategically: “Start with one domain, prove value quickly. Show that value to the business. Use success to unlock the next domain.”
Why this matters: Platform cost is a small part of total cost of ownership (TCO) – but if adoption fails, your entire investment is wasted. In the AI era, adoption becomes even more critical. If your AI agents can’t discover and understand your data products, you’re not AI-ready – regardless of how sophisticated your models are.
The blueprint: 4 steps to an airtight investment case
Step 1: Quantify the current pain (the “before” state)
The best way to win investments is to speak the language of the decision makers. In every organization, there is some tension – some pain – that keeps leaders up at night. Identify what that is for the data issue you’re trying to solve, and use it as a starting point. Then, calculate the business impacts, using examples like these:
Data discovery pain:
- Survey: “How many hours per week do you spend finding data?”
- Calculate: (Hours × Users × Hourly Rate) × 52 weeks
- Example: If 100 analysts spend 5 hours/week at $75/hour for 52 weeks = $1.95M/year spent on data discovery
Data quality/trust pain:
- Survey: Do different analysts have different answers to the same question?
- Estimate: Number of potential decisions made on bad data
- Document: Downstream effects of decisions made on bad data, from customer churn and operational inefficiency to compliance violations from data mishandling
Resolution pain:
- Survey: “How long does it take to resolve issues when they arise?” (Tip: Be sure to distinguish between common data quality issues and critical outages)
- Calculate: Current MTTR × Number of incidents per year × Cost per hour of downtime
- Example for data quality issues: If 3 engineers at $100/hour spend 8 hours per incident across 50 incidents per year = $120K/year in direct engineering costs. Add the downstream impact: each incident blocks 200 analysts for 2 hours at $75/hour = $1.5M in lost productivity. The total cost of data quality issues comes to $1.62M annually.
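Here’s the “before” state cost model from the examples above as a minimal Python sketch (the figures are the illustrative ones used in this article, not real client data):

```python
# Discovery pain: hours wasted finding data
analysts, hours_per_week, rate, weeks = 100, 5, 75, 52
discovery_cost = analysts * hours_per_week * rate * weeks  # $1,950,000

# Resolution pain: direct engineering cost plus blocked analysts
incidents = 50
engineering = 3 * 8 * 100 * incidents  # 3 engineers, 8 hrs, $100/hr -> $120,000
blocked = 200 * 2 * 75 * incidents     # 200 analysts, 2 hrs, $75/hr -> $1,500,000

print(f"Discovery pain:  ${discovery_cost:,}/year")
print(f"Resolution pain: ${engineering + blocked:,}/year")
```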
Step 2: Project the business outcomes (the “after” state)
Different leaders prioritize different success metrics. But one thing is certain: regardless of which metrics you’re measuring, make sure you can trace them back to business outcomes. Here are a few examples:
Cost savings from reduced time waste:
Data discovery:
- Current state: $1.95M/year (based on 5 hours/week finding data)
- Future state: 100 analysts × 0.5 hours/week × $75/hour × 52 weeks = $195K/year
- Annual savings: $1.76M (90% reduction)
Resolution:
- Current state: $1.62M/year (based on 8 hours of response per incident)
- Future state: 50 incidents × 3.2 hours × 3 engineers × $100/hour = $48K in engineering costs, plus 50 incidents × 0.8 hours × 200 analysts × $75/hour = $600K in lost productivity, for a total of $648K/year
- Annual savings: $972K (60% reduction)
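The same before/after comparison as a minimal sketch, reusing the figures above:

```python
def savings(before: float, after: float) -> str:
    return f"${before - after:,.0f}/year ({1 - after / before:.0%} reduction)"

# Discovery: 5 hours/week of searching shrinks to 0.5 hours/week
print("Discovery: ", savings(1_950_000, 100 * 0.5 * 75 * 52))

# Resolution: MTTR drops from 8 to 3.2 hours, blocked time from 2 to 0.8 hours
after_resolution = 50 * 3.2 * 3 * 100 + 50 * 0.8 * 200 * 75
print("Resolution:", savings(1_620_000, after_resolution))
```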
Quality improvements from standardizing KPIs:
Invitation Homes established 350+ certified KPIs with clear ownership, definitions, and quality guarantees. Before, five different analysts would have five different answers to the same question; after, they had a single source of truth with quality assurance.
The impact of quality improvements can be measured in a number of ways, from less time and money spent reconciling inconsistencies to faster decision-making. We’ll look at the savings from reduced rework and re-analysis:
- If 25% of analytics projects require rework due to data quality issues (conflicting definitions, wrong data sources, unclear transformations), and an average project consumes 40 hours of analyst time at $75/hour = $3,000 per project
- 100 projects/year × 25% rework rate × $3,000 = $75K annual waste
- Governance implementation drops the rework rate to 5% = $15K annual waste
- Annual savings: $60K in reduced rework (80% reduction)
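As a sketch, that rework calculation looks like this (rates and volumes are the illustrative ones above):

```python
projects, hours_each, rate = 100, 40, 75
cost_per_project = hours_each * rate  # $3,000


def rework_waste(rework_rate: float) -> float:
    return projects * rework_rate * cost_per_project


before, after = rework_waste(0.25), rework_waste(0.05)  # $75K vs. $15K
print(f"Savings: ${before - after:,.0f}/year ({1 - after / before:.0%} reduction)")
```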
Adoption gains that drive data utilization:
Relying on a few analysts with tribal knowledge is a recipe for disaster: not only does it create a massive bottleneck, but an analyst’s context and knowledge walk out the door when they leave.
VMO2 avoided that risk by prioritizing data democratization and championing enterprise-wide data marketplace adoption. That effort resulted in 6,000 engaged users generating 1M data product views – distributing decision-making across the business and eliminating the single-analyst bottleneck.
Chris Durham of Invitation Homes reiterated the importance of adoption, saying: “The most important thing about a governance platform is ensuring that you do drive adoption, ultimately because it is as much a technology decision as it is a people and process decision.”
- Current state: Data experts field 500 questions/month, averaging 30 minutes each = 250 hours/month at $150/hour = $37.5K/month, $450K/year
- Future state: 80% of questions self-served through governance platform, experts only handle the remaining complex 20% = 50 hours/month = $7.5K/month, $90K/year
- Monthly savings: $30K = $360K annually (80% reduction)
This has the additional value of enabling faster decision-making – with questions answered in minutes instead of days – and reduced frustration, so data experts can focus on high-value work.
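A minimal sketch of that self-service deflection math:

```python
questions, minutes_each, expert_rate = 500, 30, 150
current = questions * (minutes_each / 60) * expert_rate * 12  # $450,000/year

self_service_rate = 0.80                    # questions answered without an expert
future = current * (1 - self_service_rate)  # $90,000/year

print(f"Annual savings: ${current - future:,.0f}")  # $360,000
```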
Step 3: Build the investment case
With your before and after story in place, you can start adding up the investment costs that determine true ROI. These include:
- Platform licensing (governance platform, complementary tools)
- Implementation services
- Internal resources (governance team, stewards, enablement)
- Ongoing maintenance and evolution
To calculate ROI, divide the total annual benefit (time savings + quality improvements + risk reduction) by total cost of ownership. This helps quantify the solution’s potential and set a tangible goal to work toward, which in turn helps with buy-in.
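Putting it together, a minimal ROI sketch using the example savings above and a hypothetical TCO figure:

```python
# Annual benefit: discovery + resolution + rework + self-service savings
annual_benefit = 1_755_000 + 972_000 + 60_000 + 360_000  # $3,147,000

# Hypothetical TCO: licensing + services + internal team + maintenance
tco = 750_000

print(f"ROI: {annual_benefit / tco:.1f}x")
print(f"Payback: {12 * tco / annual_benefit:.1f} months")
```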
Step 4: The no-action case
What happens if you don’t invest? The metrics are critical to consider (and present to execs), but painting the picture of a future without AI-ready governance can help expose new pain points.
“What is an AI strategy killer? Poor quality. No context. Lack of governance. Lack of a framework,” said Amie Bright, VP of Enterprise Data and Insights at GitLab. “All of these things are going to impact our ability to take advantage of AI.”
You can’t deploy AI without a governance foundation. And while you hesitate, competitors with better data move faster.
“Consequences carry far greater business and reputational costs than the investment to get governance right,” summed up Rahul Bakhshi, VP of Data & Technology at New York Life.
The dashboard: What to track post-implementation
How will you know if your implementation is on track to achieve the anticipated ROI? Monitoring leading, lagging, and governance health indicators surfaces early signals and confirms business outcomes.
Leading indicators:
- Number of assets cataloged
- Number of active users
- Number of searches performed
- Metadata completeness scores
Tip: Drive engagement by making context fun. General Motors gamified metadata completeness and classified 98% of its cloud data. See how here → [Link]
Lagging indicators:
- Time to discovery
- Time to resolution
- Asset convergence
- Self-service rate (% questions answered without data team)
- Data product adoption
Tip: Track convergence, not just coverage. Invitation Homes consolidated 45 separate fragmented assets into single, trusted sources by measuring data product managers on both adoption (are people using this?) and convergence (are we eliminating redundancy?). Hear how here → [Link]
Governance health indicators:
- % assets with certified metadata
- % assets with ownership assigned
- % assets with quality checks in place
- % assets with lineage mapped
Tip: Set your criteria early. At Mastercard, the data governance team worked directly with data stewards to define what metadata is required to be considered a “data product,” so that quality thresholds were established from the start. Hear their approach here → [Link]
Tracking these metrics consistently helps tell the story of governance maturity over time.
The business case template for data governance
Once the data is flowing in, it’s time to put the business case in executive-ready terms. Here’s what to include:
Executive summary:
- Current pain quantified: Discovery time, resolution time, quality issues
- Proposed solution: Governance platform + operating model
- Expected ROI: Time savings, quality improvements, risk reduction
- Investment required: Platform, services, internal resources
- Payback period and ongoing value
Supporting evidence:
- User survey results: Where are users getting stuck and why?
- Industry benchmarks: Where do we have gaps?
- Vendor evaluation: Which platforms did we test and how did they perform?
- Pilot results: What did we test, and how do we expect those results to scale?
- Reference customers: Which other companies of similar scale/industry have done this successfully?
Why ROI matters more than ever in the AI era
Every speaker at Atlan Re:Govern emphasized the same reality: you cannot scale AI without context and governance foundations.
The business case for governance used to be about avoiding fines and reducing inefficiency. Today, the business case is about enabling AI.
Mastercard’s CDO, Andrew Reiskind, framed it perfectly in his keynote: we’ve moved from “privacy by design” to “data by design” to “context by design.”
Your AI agents need more than data – they need context. What does this field mean? Who owns it? Can it be used for this purpose? What’s the lineage? What are the quality guarantees?
Without that context, AI is operating blind. With it, AI becomes not just accurate, but transformational.
That’s the real ROI of governance: Not just preventing disasters, but enabling the future.
Watch the full ROI Before AI panel here.
Atlan is the next-generation platform for data and AI governance. It is a control plane that stitches together a business's disparate data infrastructure, cataloging and enriching data with business context and security.


