Pricing: Estimating TCO and ROI for Your Business

Updated May 24th, 2024

The platform is a cloud-based solution designed to streamline data discovery and governance for businesses. While it offers a variety of pricing plans, determining the exact cost for your organization can be challenging.

Similar to other major software purchases, the total cost of ownership (TCO) goes beyond just the software licenses.

This article takes you through the vendor’s pricing options and the factors influencing its cost. The idea is to help you estimate the TCO for your organization so you can make an informed decision.


Looking for a data catalog with an ROI you can present to your CDO? Atlan is designed for adoption and embedded with automation. It helps you save time, cut cloud costs, and make faster, better decisions that lead to revenue.

Get Atlan Pricing →

Table of contents

  1. Pricing overview
  2. Pricing: Estimating the total cost of ownership
  3. A Return on investment?
  4. Related Reads

Pricing overview

While the vendor doesn’t publicly list prices, we can piece together a clearer picture using information from their pricing page, AWS Marketplace listings, and other sources. Here’s a breakdown of their tiered structure, along with estimated annual costs based on AWS Marketplace data (though it’s always best to contact the vendor directly for a precise quote).

  • Essentials: Their most basic tier likely caters to smaller businesses or those starting their data catalog journey. It includes basic metadata management and Tier 1* integrations for an estimated annual cost of $90,000.
  • Standard: This tier builds upon Essentials, offering additional features such as Enterprise support & SLAs, full audit log history, Eureka Explorer Lineage, Data Governance Core, and Tier 2* integrations for an estimated annual cost of $120,000.
  • Enterprise: This tier is designed for larger organizations. It builds upon Standard features and provides access to Tier 3* integrations, on-premise data collection agents, and secure connectivity services for an estimated annual cost of $180,000.
  • Enterprise+: This highest tier caters to highly specific needs, especially regarding security and contractual requirements. It’s likely a custom package that can include advanced offerings like single-tenant installations, customer-managed encryption keys, and adherence to specific security certifications. Pricing for this tier would be based on your specific requirements.

All tiers include 10 users, with additional user fees and volume discounts for larger teams. You can further customize your plan with add-ons, expanding functionality through additional integrations, extended data virtualization/federation, and sensitive data discovery. Notably, all tiers come equipped with the vendor’s AI assistants – Archie Bots, BB Bots, and Hoots.
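To see how the user-based pricing above might play out, here is a minimal sketch. The base figures are the AWS Marketplace estimates quoted above; the per-user fee and volume-discount values are hypothetical placeholders, since the vendor does not publish them.

```python
# Hypothetical cost estimator for the tiered pricing described above.
# Base prices are the AWS Marketplace estimates quoted in this article;
# EXTRA_USER_FEE and VOLUME_DISCOUNT are illustrative assumptions, since
# the vendor does not publish per-user fees or discount thresholds.

TIER_BASE_ANNUAL = {
    "Essentials": 90_000,
    "Standard": 120_000,
    "Enterprise": 180_000,
}
INCLUDED_USERS = 10           # every tier bundles 10 users
EXTRA_USER_FEE = 1_200        # assumed annual fee per additional user
VOLUME_DISCOUNT = 0.10        # assumed discount beyond 50 users

def estimate_annual_cost(tier: str, users: int) -> int:
    """Rough annual license cost for a given tier and user count."""
    base = TIER_BASE_ANNUAL[tier]
    extra_users = max(0, users - INCLUDED_USERS)
    cost = base + extra_users * EXTRA_USER_FEE
    if users > 50:
        cost = round(cost * (1 - VOLUME_DISCOUNT))
    return cost

print(estimate_annual_cost("Standard", 40))  # → 156000
```

Plugging your own numbers from a vendor quote into a model like this makes it easy to compare tiers as your team grows.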

✔️ * For an understanding of which integrations are classified as Tier 1, 2, and 3, please check the vendor’s official pricing information.

Pricing: Estimating the total cost of ownership

While the vendor’s pricing structure offers various options and some level of customization, it’s crucial to consider the total cost of ownership (TCO) beyond the base license fees. Let’s look into the factors that can influence your final cost and help you determine if the platform delivers a return on investment (ROI) for your organization.

For a more detailed breakdown of pricing factors, you can visit our article on Data Catalog Pricing: Understanding What You’re Paying For.
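The cost factors discussed below can be combined into a simple first-pass TCO model. The sketch below uses entirely hypothetical placeholder figures; replace them with your own estimates for each factor.

```python
# First-pass three-year TCO model for a data catalog purchase.
# Every figure here is a placeholder assumption, not a quoted price.

YEARS = 3

costs = {
    "licensing": 120_000 * YEARS,        # annual license fee
    "hosting_overages": 5_000 * YEARS,   # egress / storage beyond included caps
    "implementation": 60_000,            # one-time SI / integration work
    "people": 40_000 + 15_000 * YEARS,   # initial training + ongoing enablement
    "maintenance": 10_000 * YEARS,       # troubleshooting engineering time
}

tco = sum(costs.values())
print(f"3-year TCO: ${tco:,}")
for item, amount in sorted(costs.items(), key=lambda kv: -kv[1]):
    print(f"  {item:20s} ${amount:>9,}")
```

Even a rough model like this makes it clear that license fees are only one line item among several.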

1. Upfront licensing costs

Licensing fee structures vary depending on the service provider. These usually depend upon several factors, including but not limited to:

  • Number of users
  • Number of data connectors used to connect to the data sources in your data estate
  • Other product features that the service provider has deemed as “add-ons”

In this vendor’s case, connectors are differentiated by pricing tier. The lowest tier, Tier 1, supports integrations only with cloud-native tools like data warehouses and BI tools. To access on-premise/legacy integrations like SAP, Oracle, Teradata, etc., customers have to opt in to higher tiers. The vendor also offers add-on product packages to enhance capabilities beyond the tier functionality (e.g., extended data virtualization and federation). However, features like sensitive data discovery, which other catalog providers include for free in their standard offerings, are priced as an advanced package.

Lineage is a critical feature for many customers evaluating data catalogs, as it powers root cause analysis, impact analysis, and cost optimization. The vendor’s native lineage offering is reportedly not up to the mark, so they rely on Manta (a third-party lineage provider) for their lineage solution. To access this, a customer has to pay extra for Advanced Partner Lineage.

2. Hosting costs

There are three basic ways you can consume a data catalog:

  • As a Software as a Service (SaaS) offering
  • As a cloud-based catalog deployed to a general cloud services provider, such as Amazon Web Services (AWS) or Google Cloud Platform (GCP)
  • As an On-premise catalog, i.e., hosted in your own data centers

If running in the cloud or on-premise, a data catalog requires dedicated storage and computation. Depending on your hosting service, there may be data egress costs as you transfer data between systems. An on-premise installation will also incur additional capital expenses for server racks and other hardware. For SaaS offerings, hosting costs are generally baked into the licensing cost; however, there may be additional charges for exceeding set capacity limits (for example, a certain number of GB of data egress or metadata storage). This vendor’s catalog runs as a SaaS product that the company deploys for its users.
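As a concrete illustration of how capacity overages add up, here is a small sketch. The included allowance and per-GB rate are hypothetical placeholders, not the vendor’s published limits.

```python
# Illustrative overage calculation for a SaaS catalog whose plan includes
# a fixed monthly egress allowance. The cap and the per-GB rate below are
# hypothetical assumptions, not published figures.

INCLUDED_EGRESS_GB = 500      # assumed monthly allowance in the plan
OVERAGE_RATE_PER_GB = 0.09    # assumed per-GB charge beyond the allowance

def monthly_egress_charge(egress_gb: float) -> float:
    """Charge for egress beyond the included allowance, in dollars."""
    overage = max(0.0, egress_gb - INCLUDED_EGRESS_GB)
    return round(overage * OVERAGE_RATE_PER_GB, 2)

print(monthly_egress_charge(650))  # 150 GB over the cap → 13.5
```

Checking your plan’s capacity limits against your expected metadata volume up front avoids surprise charges later.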

3. Implementation costs

Implementation costs for data catalogs depend on how difficult it is to ingest metadata from all your critical sources. Often, this requires hiring system integrators (SIs) who specialize in the platform, and those professionals don’t come cheap. The vendor provides a wide array of metadata collectors designed to pull metadata from your systems. These collectors can be run in two ways: as cloud collectors or as on-premise collectors. The typical feedback is that metadata ingestion is smooth when a collector is available for the source.

4. People costs

People costs are a function of the effort involved in customising and configuring the platform for your use cases, plus the training costs for end users to adopt the tool.

Though the platform is customisable, the customisation effort required is cumbersome and requires skilled individuals, according to many public reviews. Here are a few specific examples:

  • The platform leverages the SPARQL query language for querying its knowledge graph to extract metadata insights. However, SPARQL is not commonly known to end users. One user noted, “It’s challenging to find training for the SPARQL query language used to create such automations.” Another review highlighted that it is less appropriate “For those unfamiliar with database concepts and SQL.”
  • The platform lacks personalisation to cater to different user personas. To overcome that, customisation is key. However, customers find it challenging – for example: “There is very little flexibility in the ability to personalize. Many things you need to do need very specialized talent, that really doesn’t exist in the marketplace.”

End users need to be trained in the tool to drive adoption. The cost of training varies greatly because it depends on how complex the tool is to learn and use, and on how long it takes for technical and non-technical data users to get onboarded and start benefiting from the data catalog.

Some public reviews indicate that the product is not easy to use. One user said, “The user interface is somewhat dated. The learning curve can be steep for users who need to manage metadata resources.” A user from one of their marquee clients, WPP, compared the user experience to “software from early 2010s. It feels dated.” This is an important factor to keep in mind, as it might result in higher training costs to educate users.

5. Ongoing tool maintenance and troubleshooting costs

Every software system needs maintenance and issue troubleshooting, and a data catalog is no different. For a smooth experience, the tool should provide easy-to-troubleshoot features, an effective customer support team that is responsive to tickets raised, and good documentation. If any of these are lacking, more engineering bandwidth goes into maintenance and troubleshooting, which adds to the cost. The vendor’s support has mixed feedback in public reviews, with both positive and negative sentiments. Another limitation is that support is currently only available for users in the US and EU regions.

Product maintenance has been unpleasant for some customers due to a lack of transparency around new product developments. A user noted, “we find ourselves needing to refactor and redesign things frequently to adjust things we had previously ‘customized’ to instead use the new ‘out-of-the-box’ features… we’d appreciate better insight into the roadmap so that we do not spend significant time implementing custom resources and solutions that later must be redone”.

A Return on investment?

If your enterprise prioritizes basic data catalog capabilities and your team has the bandwidth for user training, the platform could be a viable option. However, if your focus is on empowering business teams with self-service data access, fostering collaboration across departments, and scaling your data management needs, you might require a platform with more robust features in those areas.

This is where a modern data catalog tool like Atlan comes in. By integrating seamlessly with a wide range of existing tools, Atlan offers deeper functionality in discovery, lineage, security, and access control, and provides personalized support and adoption strategies.

  • Partner not Vendor Approach: Atlan goes beyond a simple vendor approach, acting as a partner to craft a custom data strategy and implementation plan for your organization. This partnership aims to accelerate adoption by achieving widespread use for the first data use case within 90 days, and it can even guide your team in adopting modern data management concepts like data mesh, data products, and data contracts.
  • Extensibility and Open Platform to Adapt to Enterprise Environments: Atlan boasts of an open platform designed for extensibility within enterprise environments. Their open API architecture and custom package options empower programmatic management of active metadata. This allows for automation of tasks like data deletion and proactive alerting, streamlining data governance for complex ecosystems.
  • Personalized User Experience for All: Unlike the platform’s limited customization capabilities, Atlan’s intuitive interface caters to diverse data personas. Business users, analysts, and data engineers can all leverage Atlan’s personalized dashboards, Chrome extension, and integrations with familiar tools like Slack, Teams, GitHub, and Excel. This drives broader user adoption and data democratization.
  • Active Metadata Approach for Modern Governance: Atlan champions active data governance, bringing metadata directly into users’ workflows for actionable insights. This aligns with the modern DataOps approach, enabling a shift-left strategy where data governance is integrated throughout the development lifecycle.

